
NVIDIA GeForce GTX 780 Ti Video Card Review


DX11: Metro 2033

Metro 2033 is an action-oriented video game that combines survival horror and first-person shooter elements. The game is based on the novel Metro 2033 by Russian author Dmitry Glukhovsky. It was developed by 4A Games in Ukraine and released in March 2010 for Microsoft Windows. Metro 2033 runs on the studio’s own 4A Engine, which supports DirectX 9, 10, and 11, along with NVIDIA PhysX and GeForce 3D Vision.

The 4A Engine is multi-threaded in such a way that only PhysX has a dedicated thread; all other work uses a task model without any pre-conditioning or pre/post-synchronizing, allowing tasks to be executed in parallel. The engine can utilize a deferred shading pipeline, uses tessellation for greater performance, and also offers HDR (complete with blue shift), real-time reflections, color correction, film grain and noise, along with support for multi-core rendering.
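To make that threading model concrete, here is a minimal sketch of the idea in C++, assuming a simple mutex-guarded queue; the names (TaskPool, Submit) are our own invention and this is not 4A Games’ actual code. One dedicated thread stands in for the PhysX thread, while generic workers drain a queue of independent tasks with no pre/post-synchronization between them:

```cpp
// Minimal sketch of a task-model engine loop -- illustrative only, with
// hypothetical names (TaskPool, Submit); this is not 4A Games' code.
// One dedicated thread stands in for the PhysX thread, while generic
// workers drain a queue of independent tasks in parallel.
#include <atomic>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class TaskPool {
public:
    explicit TaskPool(unsigned workers) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { Drain(); });
    }

    void Submit(std::function<void()> task) {
        std::lock_guard<std::mutex> lock(mutex_);
        tasks_.push(std::move(task));
    }

    ~TaskPool() {                 // drain remaining tasks, then join
        done_ = true;
        for (auto& t : threads_) t.join();
    }

private:
    void Drain() {
        for (;;) {
            std::function<void()> task;
            {
                std::lock_guard<std::mutex> lock(mutex_);
                if (tasks_.empty()) {
                    if (done_) return;
                    continue;     // busy-wait keeps the sketch short
                }
                task = std::move(tasks_.front());
                tasks_.pop();
            }
            task();               // tasks are independent: no pre/post sync
        }
    }

    std::queue<std::function<void()>> tasks_;
    std::mutex mutex_;
    std::atomic<bool> done_{false};
    std::vector<std::thread> threads_;
};

int main() {
    std::atomic<bool> quit{false};
    // Dedicated thread, analogous to the engine's PhysX thread.
    std::thread physics([&] { while (!quit) { /* step simulation */ } });

    {
        TaskPool pool(3);         // e.g. visibility, animation, audio
        for (int i = 0; i < 8; ++i)
            pool.Submit([i] { std::printf("task %d done\n", i); });
    }                             // pool destructor drains and joins here

    quit = true;
    physics.join();
    return 0;
}
```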

Metro 2033 featured superior volumetric fog, double PhysX precision, object blur, sub-surface scattering for skin shaders, parallax mapping on all surfaces, and greater geometric detail with less aggressive LODs. Through PhysX, the engine supports features such as destructible environments, cloth and water simulation, and particles that can be fully affected by environmental factors.

NVIDIA has been diligently working to promote Metro 2033, and for good reason: it’s one of the most demanding PC video games we’ve ever tested. When their former flagship GeForce GTX 480 struggles to produce 27 FPS with DirectX-11 anti-aliasing turned down to its lowest setting, you know that only the strongest graphics processors will generate playable frame rates. All of our tests enable Advanced Depth of Field and Tessellation effects, but disable advanced PhysX options.

  • Metro 2033 Benchmark
    • Settings: Very-High Quality, 4x AA, 16x AF, Tessellation, PhysX Disabled
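A note on reading the results below: an average frame rate is total frames divided by total elapsed time, not the mean of each frame’s instantaneous FPS. This small sketch shows the calculation; the frame times in it are invented for illustration, not our test data:

```cpp
// Illustrative only: deriving an "average FPS" figure from per-frame
// render times. The frame times below are made up, not benchmark data.
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> frame_ms = {22.1, 25.4, 19.8, 31.0, 24.3};

    double total_ms = 0.0;
    for (double ms : frame_ms) total_ms += ms;

    // Frames divided by elapsed time -- averaging each frame's
    // instantaneous FPS instead would overweight the fast frames.
    double avg_fps = frame_ms.size() / (total_ms / 1000.0);
    std::printf("average: %.1f FPS over %zu frames\n", avg_fps, frame_ms.size());
    return 0;
}
```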

[Chart: Metro 2033 DX11 benchmark results]

Metro 2033 Benchmark Test Results

| Graphics Card      | GPU Cores | Core Clock (MHz) | Shader/Boost Clock (MHz) | Memory Clock (MHz) | Memory Amount | Memory Interface |
|--------------------|-----------|------------------|--------------------------|--------------------|---------------|------------------|
| GeForce GTX 580    | 512       | 772              | 1544                     | 1002               | 1536MB GDDR5  | 384-bit          |
| Radeon R9 270X     | 1280      | 1030             | 1120 (Boost)             | 1400               | 2048MB GDDR5  | 256-bit          |
| Radeon HD 7950     | 1792      | 850              | N/A                      | 1250               | 3072MB GDDR5  | 384-bit          |
| GeForce GTX 760    | 1152      | 980              | 1033 (Boost)             | 1502               | 2048MB GDDR5  | 256-bit          |
| GeForce GTX 680    | 1536      | 1006             | 1058 (Boost)             | 1502               | 2048MB GDDR5  | 256-bit          |
| Radeon HD 7970     | 2048      | 925              | N/A                      | 1375               | 3072MB GDDR5  | 384-bit          |
| GeForce GTX 780    | 2304      | 863              | 902 (Boost)              | 1502               | 3072MB GDDR5  | 384-bit          |
| Radeon R9 290X     | 2816      | 1000             | N/A                      | 1250               | 4096MB GDDR5  | 512-bit          |
| GeForce GTX 780 Ti | 2880      | 876              | 928 (Boost)              | 1750               | 3072MB GDDR5  | 384-bit          |
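As a quick sanity check on the memory columns above: GDDR5 is quad-pumped, so its effective data rate is four times the listed memory clock, and peak bandwidth follows from the bus width. The sketch below is our own back-of-the-envelope arithmetic, not data from the benchmark:

```cpp
// Peak memory bandwidth from the table's memory clock and bus width.
// GDDR5 is quad-pumped: effective data rate = 4x the listed clock.
#include <cstdio>

double PeakBandwidthGBs(double mem_clock_mhz, int bus_bits) {
    double effective_mtps = mem_clock_mhz * 4.0;      // mega-transfers/s
    return effective_mtps * bus_bits / 8.0 / 1000.0;  // bits -> bytes -> GB/s
}

int main() {
    // GTX 780 Ti: 1750 MHz on a 384-bit bus -> 336 GB/s
    std::printf("GTX 780 Ti: %.0f GB/s\n", PeakBandwidthGBs(1750, 384));
    // R9 290X: 1250 MHz on a 512-bit bus -> 320 GB/s
    std::printf("R9 290X:    %.0f GB/s\n", PeakBandwidthGBs(1250, 512));
    return 0;
}
```

That works out to 336 GB/s for the GTX 780 Ti against 320 GB/s for the R9 290X: the 780 Ti’s faster memory more than offsets the 290X’s wider bus.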



7 comments


  1. CrazyElf

    It’s competitive from a price-performance standpoint (a first, I suppose, for high-end GPUs).

    Hmm … it looks like:
    – It beats the Titan handily (unless the 3GB of VRAM runs out)
    – At lower resolutions, and more importantly at 2560×1440, it beats the 290X

    I wonder, though, how the 780 Ti in SLI would do against 290X cards in CrossFire at 4K?

  2. Caring1

    I’ve only read the first page and already it reads like a spiel from the Nvidia marketing division.
    I would have expected a more professional, independent approach to the review, but it seems there is a bias towards Nvidia products here.

    “delivers a host of additional features not seen or available from the competition. Ultra HD 4K resolution displays are supported, and so is the cutting-edge G-SYNC technology”

    So where is this host of features? Ultra HD is supported on the new AMD card, and G-Sync is an Nvidia product not applicable to AMD cards. That makes one feature so far…
    Can we please try to be professional when reviewing?

    This type of B.S. isn’t needed or necessarily true: “NVIDIA tends to dominate the field when it comes to graphics processing power, leaving AMD scrambling to remain competitive by reducing prices on their products to add value for an aging technology.”
    The clincher there is you left off the “pre” from the second word in that sentence.

    1. Olin Coles

      Did I offend an AMD fanboy with the truth? Only someone like that would go off on a rant without reading anything more than the first few paragraphs, and then selectively ignore the content. Since you didn’t make it past page one, here are the features you missed:
      NVIDIA G-SYNC (noted)
      NVIDIA ShadowPlay (mentioned in the same paragraph you quoted)
      NVIDIA Boost 2.0 (listed next)
      FXAA and TXAA post-processing
      NVIDIA 3D Vision
      Adaptive VSync
      PhysX technology

      Furthermore, please feel free to compare the months that NVIDIA and AMD have each been the leader in discrete graphics technology. You’ll see that NVIDIA offers the ‘most powerful’ video card 11 months for every 1 month (rounded up) that AMD has managed to do so. Facts… they’re so pesky.

      1. Caring1

        More like I offended an Nvidia fanboy.
        The features you mention are proprietary; AMD also has a large list of proprietary features, something you neglect to mention in your fervour and slavering over your favourite company.
        Funny how you always seem to include negative remarks about AMD, even when they hold no relevance to the comparison.

        1. Olin Coles

          I said “delivers a host of additional features not seen or available from the competition”, which you’ve just confirmed to be completely true by pointing out their proprietary nature. Also, in much the same way as you ignorantly posted a rant without reading the article, you’ve failed to notice how many AMD articles I’ve written… namely the recent R9 270X reviews of MSI and Sapphire cards, both of which received my praise and awards.

          As I’ve mentioned several times before: I don’t care who makes the product. All I care about is who offers the best product, the best features, or the best value. It’s easy to post a ridiculous comment that cries foul when you ignore facts like benchmarks, temperatures, fan noise, features, etc. Obviously you’re blinded by your commitment to the AMD brand, as evidenced by your comment in defense of Radeon R9 290X against GeForce GTX 780 Ti. Regardless of how you want to twist things: AMD is still #2 in GPU performance just like they usually are.

  3. cobra32

    I read your article and it is biased: no Ultra HD 4K numbers, and lower resolutions for some games, higher for others. The 290X is a godsend; it made Nvidia adjust their prices, and that is more important than anything. If it were not for AMD you would be paying $1,200 for that GTX 780 Ti. AMD gives us GTX 780 performance for $399.00 with the R9 290, while Nvidia gave it to us for $649.00 with the GTX 780.

    The 290X is the future; with some tweaking it will be the card of the future. With Ultra HD 4K on the horizon the extra memory will come into play, and again Nvidia keeps sticking it to its customers by giving us 3GB instead of 4GB, or 6GB like the Titan, which is what this card will need for 4K gameplay. That G-Sync is a joke since none of the top monitors provide it. I bet you ran the 290X in quiet mode. You did not even use the new beta 9.2 drivers from AMD. AMD’s Mantle will also improve the gaming experience, and with all the game makers programming for AMD chips, since all the game consoles now use them, it will leave Nvidia out in the cold.

    I’m not a fanboy of either but will give credit where credit is due: Nvidia has given gamers the shaft for a while with overpriced video cards, and we can thank AMD for giving Nvidia a reality check. Thank you AMD for not fucking us like Nvidia has for a long time with overpriced video cards. I also own an overpriced EVGA Classified GTX 780 card. You got me once, Nvidia; never again. I’m waiting for the Asus Matrix R9 290X; that will be a hot video card.

    1. Steven Iglesias-Hearst

      The AMD R9 290X is not GTX 780 performance for $399.00.

      For starters, the R9 290X has no overclocking headroom, but the GTX 780 does. Nvidia had the GTX 780 Ti ready months ago but didn’t need to release it until now, since AMD are only just able to muster up some competition. You seem to forget that AMD’s recent releases were just more re-brands.
