
NVIDIA GeForce GTX 780 Ti Video Card Review


VGA Power Consumption

In this section, PCI-Express graphics cards are isolated for idle and loaded electrical power consumption. In our power consumption tests, Benchmark Reviews utilizes an 80-PLUS GOLD certified OCZ Z-Series Gold 850W PSU, model OCZZ850. This power supply unit has been tested to provide over 90% typical efficiency by Chroma System Solutions. To measure isolated video card power consumption, Benchmark Reviews uses the Kill-A-Watt EZ (model P4460) power meter made by P3 International. In this particular test, all power consumption results were verified with a second power meter for accuracy.

The power consumption statistics discussed in this section are absolute maximum values, and may not represent real-world power consumption created by video games or graphics applications.

A baseline measurement is taken without any video card installed on our test computer system, which is allowed to boot into Windows 7 and rest idle at the login screen before power consumption is recorded. Once the baseline reading has been taken, the graphics card is installed and the system is again booted into Windows and left idle at the login screen before taking the idle reading. Loaded power consumption reading is taken with the video card running a stress test using graphics test #4 on 3DMark11 for real-world results, and again using FurMark for maximum consumption values.
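The procedure above amounts to simple subtraction: the card's isolated draw is the wall reading with the card installed minus the baseline reading without it. A minimal sketch of that arithmetic (the wall readings here are hypothetical figures for illustration, not measurements from this review):

```python
def isolated_draw(reading_with_card_w, baseline_w):
    """Approximate a video card's draw from two wall-socket readings.

    Note: wall readings include PSU conversion losses, so this slightly
    overstates the card's actual DC-side consumption.
    """
    if reading_with_card_w < baseline_w:
        raise ValueError("reading with card should not be below baseline")
    return reading_with_card_w - baseline_w

# Hypothetical readings, for illustration only:
idle_card_w = isolated_draw(reading_with_card_w=112, baseline_w=100)  # 12 W
load_card_w = isolated_draw(reading_with_card_w=420, baseline_w=100)  # 320 W
```

Because both readings come from the same meter at the wall, meter calibration error largely cancels out of the difference.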

This section discusses power consumption for the NVIDIA GeForce GTX 780 Ti video card, which operates at reference clock speeds. Our power consumption results are not representative of the entire GTX 780 Ti-series product family, which may feature a modified design or factory overclocking from some partners. The GeForce GTX 780 Ti requires one 8-pin and one 6-pin PCI-E power connection for normal operation, and will not activate the display unless proper power has been supplied. NVIDIA recommends a 600W power supply unit for stable operation with one GeForce GTX 780 Ti video card.

[Image: NVIDIA GeForce GTX 780 Ti video card, angled view of the PCB]

Measured at the lowest reading, the GeForce GTX 780 Ti consumed a mere 12W at idle. NVIDIA specifies an average TDP of 250W, yet our real-world stress tests using 3DMark11 drew 295 watts from this video card. Using FurMark’s torture test to draw maximum power, the GeForce GTX 780 Ti increased its consumption to 320 watts… which is modest compared to 380W for the R9 290X.
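NVIDIA’s 600W recommendation leaves room for the rest of the system on top of the card’s worst-case draw. A rough sanity check (the non-GPU system figure below is an assumed value for illustration, not something we measured):

```python
CARD_MAX_W = 320          # FurMark maximum measured above
PSU_RECOMMENDED_W = 600   # NVIDIA's recommendation for one GTX 780 Ti

rest_of_system_w = 180    # assumed CPU/motherboard/drives draw, illustrative
total_w = CARD_MAX_W + rest_of_system_w
headroom_pct = 100 * (PSU_RECOMMENDED_W - total_w) / PSU_RECOMMENDED_W
print(f"{total_w} W total, {headroom_pct:.0f}% headroom")
# prints: 500 W total, 17% headroom
```

Even under FurMark’s unrealistic maximum load, a system built around these assumptions stays comfortably inside the recommended 600W envelope.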

These results position the GTX 780 Ti among the least power-hungry top-end video cards we’ve tested under load; more impressive still, that efficiency comes from a flagship GTX-series product. If you’re familiar with electronics, it will come as no surprise that lower power consumption means lower heat output, as our thermal results below demonstrate…

GeForce GTX 780 Ti Temperatures

This section reports our temperature results subjecting the video card to maximum load conditions. During each test a 20°C ambient room temperature is maintained from start to finish, as measured by digital temperature sensors located outside the computer system. GPU-Z is used to measure the temperature at idle as reported by the GPU, and also under load.

Using a modified version of FurMark’s “Torture Test” to generate maximum thermal load, peak GPU temperature is recorded in high-power 3D mode. FurMark does two things extremely well: it drives the thermal output of any graphics processor far higher than any video game realistically could, and it does so consistently every time. FurMark works well for testing the stability of a GPU as the temperature rises to its highest possible output.

The temperatures illustrated below are absolute maximum values, and do not represent real-world temperatures created by video games or graphics applications:

Video Card                        | Ambient | Idle Temp | Loaded Temp | Max Noise
ATI Radeon HD 5850                | 20°C    | 39°C      | 73°C        | 7/10
NVIDIA GeForce GTX 460            | 20°C    | 26°C      | 65°C        | 4/10
AMD Radeon HD 6850                | 20°C    | 42°C      | 77°C        | 7/10
AMD Radeon HD 6870                | 20°C    | 39°C      | 74°C        | 6/10
ATI Radeon HD 5870                | 20°C    | 33°C      | 78°C        | 7/10
NVIDIA GeForce GTX 560 Ti         | 20°C    | 27°C      | 78°C        | 5/10
NVIDIA GeForce GTX 570            | 20°C    | 32°C      | 82°C        | 7/10
ATI Radeon HD 6970                | 20°C    | 35°C      | 81°C        | 6/10
NVIDIA GeForce GTX 580            | 20°C    | 32°C      | 70°C        | 6/10
NVIDIA GeForce GTX 590            | 20°C    | 33°C      | 77°C        | 6/10
AMD Radeon HD 6990                | 20°C    | 40°C      | 84°C        | 8/10
NVIDIA GeForce GTX 650 Ti BOOST   | 20°C    | 26°C      | 73°C        | 4/10
NVIDIA GeForce GTX 650 Ti         | 20°C    | 26°C      | 62°C        | 3/10
NVIDIA GeForce GTX 670            | 20°C    | 26°C      | 71°C        | 3/10
NVIDIA GeForce GTX 680            | 20°C    | 26°C      | 75°C        | 3/10
NVIDIA GeForce GTX 690            | 20°C    | 30°C      | 81°C        | 4/10
NVIDIA GeForce GTX 780            | 20°C    | 28°C      | 80°C        | 3/10
Sapphire Radeon R9 270X Vapor-X   | 20°C    | 26°C      | 68°C        | 4/10
MSI Radeon R9 290X                | 20°C    | 34°C      | 95°C        | 8/10
NVIDIA GeForce GTX 780 Ti         | 20°C    | 31°C      | 82°C        | 3/10
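Because ambient is held at 20°C for every run, the loaded figures can be compared directly as a rise over ambient. A short sketch using a few rows from the table above:

```python
# (card, idle °C, loaded °C) — subset of the table above; ambient is 20°C
AMBIENT_C = 20
results = [
    ("NVIDIA GeForce GTX 780 Ti", 31, 82),
    ("NVIDIA GeForce GTX 780",    28, 80),
    ("MSI Radeon R9 290X",        34, 95),
    ("NVIDIA GeForce GTX 680",    26, 75),
]

# Rise over ambient under full load, coolest card first
by_rise = sorted((loaded - AMBIENT_C, card) for card, _idle, loaded in results)
for rise, card in by_rise:
    print(f"{card}: +{rise}°C over ambient")
```

Normalizing to a rise over ambient keeps the comparison fair if a future test is run in a slightly warmer or cooler room.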

As we’ve mentioned on the pages leading up to this section, NVIDIA’s Kepler architecture yields a much more efficient GPU than previous designs. This is evident in the low idle temperature, and it translates into modest full-load temperatures. While NVIDIA’s reference design cools the GK110 GPU on the GeForce GTX 780 Ti exceptionally well, consumers should expect add-in card partners to market over-built cooling solutions at an extra premium. 82°C after ten minutes at 100% load under FurMark’s Torture Test is a non-issue, and nowhere close to this card’s 95°C thermal threshold (a limit the R9 290X could actually reach).



7 comments


  1. CrazyElf

    It’s competitive from a price performance standpoint (a first I suppose for high end GPUs).

    Hmm … it looks like:
    – It beats the Titan handily (unless the 3gb of VRAM runs out)
    – At lower resolutions, and more important at 2560×1440 it beats the 290X

    I wonder though how it will do against the 290X Crossfired at 4K if the 780Ti is in SLI?

  2. Caring1

    I’ve only read the first page and already it reads like a spiel from the Nvidia marketing division.
    I would have expected a more professional, independent approach to the review, but it seems there is a bias towards Nvidia products here.

    “delivers a host of additional features not seen or available from the competition. Ultra HD 4K resolution displays are supported, and so is the cutting-edge G-SYNC technology”

    So where is this host of features? Ultra HD is supported on the new AMD card, G-Sync is a Nvidia product not applicable to AMD cards. That makes one feature so far…
    Can we please try to be professional when reviewing?

    This type of B.S. isn’t needed or necessarily true: “NVIDIA tends to dominate the field when it comes to graphics processing power, leaving AMD scrambling to remain competitive by reducing prices on their products to add value for an aging technology.”
    The clincher there is you left off “pre” from the second word in that sentence.

    1. Olin Coles

      Did I offend an AMD fanboy with the truth? Only someone like that would go off on a rant without reading anything more than the first few paragraphs, and then selectively ignore the content. Since you didn’t make it past page one, here are the features you missed:
      NVIDIA G-SYNC (noted)
      NVIDIA ShadowPlay (mentioned in the same paragraph you quoted)
      NVIDIA Boost 2.0 (listed next)
      FXAA and TXAA post-processing
      NVIDIA 3D Vision
      Adaptive VSync
      PhysX technology

      Furthermore, please feel free to compare the months that NVIDIA and AMD have each been the leader in discrete graphics technology. You’ll see that NVIDIA offers the ‘most powerful’ video card 11 months for every 1 month (rounded up) that AMD has managed to do so. Facts… they’re so pesky.

      1. Caring1

        More like I offended an Nvidia fanboy.
        The features you mention are proprietary, AMD also has a large list of proprietary features, something you neglect to mention in your fervour and slathering to your favourite company.
        Funny how you always seem to include negative remarks about AMD, even when they hold no relevance to the comparison.

        1. Olin Coles

          I said “delivers a host of additional features not seen or available from the competition”, which you’ve just confirmed to be completely true by pointing out their proprietary nature. Also, and in much the same way as you ignorantly posted a rant without reading the article, you’ve also failed to notice how many AMD articles I’ve written… namely the recent R9 270X by MSI and Sapphire… both of which received my praise and awards.

          As I’ve mentioned several times before: I don’t care who makes the product. All I care about is who offers the best product, the best features, or the best value. It’s easy to post a ridiculous comment that cries foul when you ignore facts like benchmarks, temperatures, fan noise, features, etc. Obviously you’re blinded by your commitment to the AMD brand, as evidenced by your comment in defense of Radeon R9 290X against GeForce GTX 780 Ti. Regardless of how you want to twist things: AMD is still #2 in GPU performance just like they usually are.

  3. cobra32

    I read your article and it is bias no ultra hd 4k numbers and lower resolution for some games higher for others. The 290X is a god send it made Nvidia have to adjust there prices and that is more important then anything. If it was not for Amd you would be paying 1200 for that 780GTX TI. Amd gives us 780 GTX performance for 399.00 with R290 while Nvidia gave it to us for 649.00 with the GTX 780.

    The 290x is the future with some tweaking it will be the card of the future. With Ultra HD 4k on the horizon the extra memory will come into play and again Nvidia keeps sticking it to its customers by giving us 3 gb instead of 4gb or 6 gb like the titan which is what this card will need in 4k game play.

    That G-sync is a joke since none of the top Monitors provide it. I bet you ran the 290x in quiet mode. You did not even use the new drivers from AMD betas 9.2 also. Amd’s mantle will also improve the gaming experience and with all the game manufactures programing for the AMD chips since all the game systems use them now it will leave NVidia out in the cold.

    I’m not a Fan boy of either but will give credit will credit is due Nvidia Has given game players the shaft for a while with over priced video cards and we can thank AMD for giving NVidia a reality check. Thank you AMD for not fucking us like Nvidia has for along time with over priced video cards. I also own a over price EVGA classified 780 GTX card. You got once NVidia never again waiting for 290x Asus matrix that will be a hot video card.

    1. Steven Iglesias-Hearst

      The AMD R9 290X is not 780 GTX performance for 399.00.

      For starters the R9 290X has no overclocking headroom, but the GTX 780 has. NVidia had the GTX 780Ti months ago but didn’t need to release it till now since AMD are only just able to muster up some competition. You seem to forget that AMD recent releases were just more re-brands.
