NVIDIA GeForce GTX 780 Ti Specs Leaked Already?

Last week AMD tried to crash NVIDIA’s press event by showing how its upcoming Radeon R9 290X video card was able to best the NVIDIA GeForce GTX 780 on a 4K monitor in a pair of game titles. NVIDIA was able to one-up that by announcing that the GeForce GTX 780 Ti video card would be coming in November and would become the company’s new flagship video card. It sounds like this card could trump AMD’s Radeon R9 290X, but no details were given at the event on the card’s hardware specifications. It is safe to assume that the GeForce GTX 780 Ti will be using the GK110 ‘Kepler’ GPU and will need to fill the rather small performance gap between the GeForce GTX Titan and the GeForce GTX 780.

NVIDIA GeForce GTX 780 Ti Video Card

Since March 2013 we’ve been hearing rumors about a GeForce GTX ‘Titan Lite’ that would have 2496 CUDA cores and 5GB of GDDR5, so could this be the GeForce GTX 780 Ti?

NVIDIA GeForce GTX 780 GPU-Z Shot

The current NVIDIA GeForce GTX 780 has 2304 CUDA cores (stream processors) with 192 texture units and 48 ROPs. The NVIDIA GeForce GTX 780 reference card (shown above) has a core clock speed of 863MHz (902MHz Boost) and 3GB of GDDR5 memory running at 1502MHz (6008MHz effective).
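
As a quick sanity check on those memory figures, here is a small back-of-the-envelope sketch (our own arithmetic, not something pulled from GPU-Z or NVIDIA): GDDR5 transfers four bits per pin per memory clock, which is where the quoted 6008MHz effective speed and the 288.4 GB/s of bandwidth on the 384-bit bus come from.

```python
# Back-of-the-envelope check of the GTX 780 memory figures quoted above.
# Assumes the usual GDDR5 convention: effective rate = 4 x memory clock.
base_clock_mhz = 1502      # memory clock reported by GPU-Z
bus_width_bits = 384       # GK110 memory interface width

effective_mhz = base_clock_mhz * 4                               # 6008 MHz effective
bandwidth_gbs = effective_mhz * 1_000_000 * bus_width_bits / 8 / 1e9

print(f"Effective memory clock: {effective_mhz} MHz")            # 6008 MHz
print(f"Memory bandwidth:       {bandwidth_gbs:.1f} GB/s")       # ~288.4 GB/s
```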

NVIDIA GeForce GTX 780 Ti GPU-Z Shot

According to the leaked GPU-Z shot, the GeForce GTX 780 Ti features the GK110 GPU and has 2496 CUDA cores, 208 TMUs and 48 ROPs. It also has a core clock speed of 902MHz (954MHz Boost) and 3GB of GDDR5 memory at 1502MHz (6008MHz effective). The GPU-Z details show the memory bandwidth remaining at 288.4 GB/s, but the pixel fillrate went up from 41.4 GPixels/s to 43.9 GPixels/s and the texture fillrate went from 165.7 GTexel/s to 205.0 GTexel/s! That is a 6% increase in pixel fillrate and a massive 23.7% increase in texture fillrate. This image could be totally fake, but it could also be the real deal.
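
For anyone who wants to reproduce those percentages, here is a tiny sketch using the fillrate figures exactly as the two GPU-Z shots report them. (GPU-Z normally derives pixel fillrate from ROPs x core clock and texture fillrate from TMUs x core clock, so these numbers are only as trustworthy as the screenshot itself.)

```python
# Percentage gains implied by the fillrate numbers in the two GPU-Z shots above.
gtx_780    = {"pixel_gpix_s": 41.4, "texture_gtex_s": 165.7}   # GeForce GTX 780
gtx_780_ti = {"pixel_gpix_s": 43.9, "texture_gtex_s": 205.0}   # leaked GTX 780 Ti shot

pixel_gain   = (gtx_780_ti["pixel_gpix_s"]   / gtx_780["pixel_gpix_s"]   - 1) * 100
texture_gain = (gtx_780_ti["texture_gtex_s"] / gtx_780["texture_gtex_s"] - 1) * 100

print(f"Pixel fillrate gain:   {pixel_gain:.1f}%")    # ~6.0%
print(f"Texture fillrate gain: {texture_gain:.1f}%")  # ~23.7%
```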

AMD and NVIDIA are locked in a pretty fierce battle over these high-end cards, and we can’t wait to see what the real specs are and who comes out on top! Last week we talked about how the move to 4K is being heavily marketed by both AMD and NVIDIA, and the AMD Radeon R9 290X and the NVIDIA GeForce GTX 780 Ti appear to be the cards to have for a single-GPU 4K setup.

  • trajan2448

    2880 cores. Beats Titan and overclocks to 1240 on the core. It’s a beast.

  • FatAmerica

    I cannot believe it’s still on a 384-bit memory interface, just like the Titan. The R9 series is going to destroy NVIDIA or make NVIDIA rush to the 800 series.

    • Derek Despard

      That was already confirmed; the GTX 800 series is to be released in the first quarter of 2014 with the Maxwell architecture, which is “in theory” twice the card that the current gen is.

      • David Calloway

        I think a 1st quarter 2014 architecture shrink is a bit premature at this point… and when was the last time we saw the next gen give us twice the card?

  • Derek Despard

    Tested on PCI-e 2.0? Fake…

    • Jonathan

      I’m not sure if Nvidia has officially supported PCI-e 3.0 on the 700 series, have they?

      • Derek Despard

        Since the GTX 600 series they have.

        • Jonathan

          Oh nevermind, I was thinking of the X79 chipset.

  • victor

    What is the use of 5GB or 6GB of RAM when most games don’t even use 2GB? It just works as a stunt to attract the masses. 4GB of RAM is more than future proof.

    • basroil

      Battlefield 4 uses at least 2.5GB at Ultra settings at 1080p. When you jump to 4K screens (which is what the 780 Ti was intended for, according to NVIDIA), BF4 uses a whopping 5.1GB of RAM! Yes, you need that 6GB if you want to be “future proof”.

      • victor

        Again, the same question: 4K monitors are used by less than 0.01% of people. I am speaking for the remaining buyers. Triple-monitor setups are used by many, but even for them 4GB is more than enough for the next 3 years. Don’t speak about future proofing when 70-80% of the high-end market upgrades every 3 years. Can you show any game benchmark that uses more than 3GB of RAM now?

        • Damo

          Victor – you’re the one who said that the 4GB of RAM *is* future proof and now you’re saying “don’t speak about future proof”. The truth is that there isn’t a single card out there right now that’s future proof – look at the SLI benchmarks for this card at 4K; they barely cut it. These are high-end cards, so you have to include high-end monitors in the mix! The 0.01% now will be 2 or 3% in a year. So, this is a beast of a card; it’s probably killing everything at 1440p right now and beats the Titan in most benchmarks. But it’s not future proof.

    • ShaneMcGrath

      Are you kidding? Ever heard of mods? Also multi-monitor setups, 3D, etc.
      Skyrim with HD textures and a crapload of mods easily goes past 2GB already. No such thing as future proof in PC land, buddy.

      • victor

        How many people buying a Titan are going to keep their PC for more than 3 years? 30%. The remaining 70% will upgrade every 3 years; the people buying a high-end Titan always want to stay up to date. So for 3 years 4GB is plenty. How many will buy a 4K display at the current price? It will take a minimum of 4 years for the price to come down enough to be considered by the general mass. Skyrim with all the mods goes past 2GB in Ultra; that’s why I said 4GB is plenty for now.

  • godrilla

    Hopefully the vanilla Maxwell flagship will have more RAM standard!

  • Guest

    Next they will be selling it including a fusion-powered board to power the damn thing.

    • basroil

      It’s not AMD, so no worries. An i7 4770 and a GeForce TITAN together are under 400W peak even with mobo, memory, and disks included. An AMD FX9350 + 7990 is well over that, so I can see why you worry.

      • YYZ

        So many people pissing and moaning about power usage with high-end components is ridiculous; in many ways it’s like complaining about how much fuel a top fuel dragster uses. Nobody “really” cares; it’s just useless bitching. When upgrading, I only check the wattage to make sure my PSU can run it, but making a final purchase decision because a card uses moderately less power is silly. $ per FPS, which the 290X excels at, is the only calculation that makes up my mind on buying or not. Go turn a few lights off.

  • BpK

    Kind of dumb if this is real. There is no “gap” between the 780 and Titan. At least not one worth filling.

    Now if they’re simply doing this to allow them to release a very high-end card around the $500 mark…then that’s a totally different story.

    • Jonathan

      It’s not really dumb, I’d say. Honestly, Titan isn’t even meant for gaming. The 6GB of VRAM and the unhobbled double precision make it a compute/workstation card. The GTX 780 and presumably the 780 Ti will both have hobbled double-precision compute capabilities. They’re focused on gaming, and perhaps have better overclocking headroom with the DP units disabled/hobbled.

      Plus, with as mature as the TSMC 28 nm process is becoming, yields should be a lot better than they initially were, which gives Nvidia the ability to release a gaming-oriented 14 SMX part (currently the Titan) and even a full 15 SMX enabled (better than Titan) part, for gaming, compute, or both.

      Perhaps it’ll depend on how well the R9-290X does. As it looks, it probably won’t beat Titan and unless the overclocking headroom is nice, will be somewhat better than a 780 and on par or nearly so with a 780 Ti. Nvidia can drop the 780 price, put the 780 Ti at the same price bracket as the R9-290X, and they get another full year to rest on their laurels, until Maxwell is released.

      • Nathan Kirsch

        “Nvidia can drop the 780 price, put the 780 Ti at the same price bracket
        as the R9-290X, and they get another full year to rest on their laurels,
        until Maxwell is released.”

        Sounds highly likely that this is what will happen

        • BpK

          The 780ti better be nipping at the heels of the Titan…as in it wins some and loses some in a heads-up round of benchmarking.

        • Jonathan

          For the reasons I wrote above, the 780 Ti could even beat the Titan as far as Nvidia really needs to be concerned. Titan was never meant to be the best gaming card. And even if the 780 Ti is better than it in games, anyone who’s serious about compute would go for the Titan hands down.

          But that’s also assuming Nvidia doesn’t do a Titan Ti or something. With the full 15 SMXs. Or even allow their card partners to unlock voltages to let the cards really shine in overclocking.

      • BpK

        The initial benchmarks on the 290X blow the 780 away (more than 20% better on average) – and that’s with early drivers…and it’s speculated to be priced around $650. So you think the 780ti will be as good as a 290X and priced at $650?

        So the standard 780 ends up being priced where? $500?

        • Jonathan

          I’m still waiting for official benchmarks. Everything so far (at least that I’ve seen) has been rumored, and the most recent ones put the lead above the 780 as very minor. And the 780 has some good overclocking headroom and probably beats it in power consumption.

        • Phil Deakin

          The 290X is ridiculously power hungry; in the recently released benchmarks (today’s) it used the most power of any card tested. And according to the results it is not the “titan killer” that everyone thought it was. Let’s hope the 780 Ti does better.

        • Jonathan

          And also, what with the probable yields this late into the game and how overpriced the 780 was to begin with, they could easily drop them to $500 or even lower and shift the 770 down to better compete with the 280X (the rebranded 7970 that it is… ugh). I’m still hoping the 290X comes in below $600, because at its rumored performance level, and as late to the game as it is, asking people to pay $650 for it would be stupid. Nvidia already has a year head start or more on AMD if the rumored 290X performance is accurate.

          That the excitement is over it competing with architecture and silicon that’s well over a year old, and closer to two at this point, is saddening. It goes to show how the GPU industry is slowing and stagnating, just as the CPU industry is.

        • gon

          The GPU and CPU industries are just coming back to normal speeds; people got used to a few gigantic leaps and are forgetting the times when the industry got stagnant for a while: Intel with the Pentium 4 era, which was not a huge upgrade over the P3 and stayed stagnant until Core 2 came along; AMD with the Athlon not really moving forward with the first Phenom; Nvidia with the FX 5x00 series and then the never-ending rebranding of the 8800; AMD with the 2900XT and now the 7x00 series.

          And that’s only for the “recent years”.

    • Roman Cruz

      It’s not so much filling a gap as it is minimizing the damage done by the 290X. It’s just like what AMD did when the GTX 680 was released: they dropped the price on the 7970, released an overclocked version, and put that at the old price point in an effort to mitigate losses.

  • Jonathan

    Looks at least highly plausible. All the talk of 4 GB of VRAM or just a highly overclocked 780 was illogical. There’s always a different core count between different GPU tiers, including the normal and Ti versions. The 780 has 2-3 SMXs disabled compared to the Titan, so it was only logical that Nvidia would enable one of those disabled SMXs to create a card that fits in between them.

    Which still leaves them room for something above Titan, as they haven’t yet released a consumer card with all 15 SMXs enabled.
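
To put rough numbers on the SMX math in that comment (our own illustration, nothing confirmed by NVIDIA): each Kepler SMX carries 192 CUDA cores, so the GK110 configurations being discussed line up as follows.

```python
# CUDA core counts implied by the number of enabled SMX units on GK110
# (192 CUDA cores per Kepler SMX). The 780 Ti entry reflects the leaked
# 2496-core figure, i.e. 13 SMXs, one more than the GTX 780.
CORES_PER_SMX = 192

gk110_configs = {
    "GeForce GTX 780":           12,
    "Rumored GTX 780 Ti (leak)": 13,
    "GeForce GTX Titan":         14,
    "Full GK110 die":            15,
}

for name, smx_count in gk110_configs.items():
    print(f"{name}: {smx_count} SMXs = {smx_count * CORES_PER_SMX} CUDA cores")
# GTX 780: 2304, leaked 780 Ti: 2496, Titan: 2688, full chip: 2880
```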

    • techsgt311

      Am I the only one who looked at the “DX” count? 11.0… NOT 11.1 or higher, yet it is a GK110, and supposedly a gap filler between the GTX 780 and the Titan… Anyone ever hear of ‘Photoshop’?

      • Jonathan

        Titan and the 780 are only DirectX 11 compatible. We’ll likely have to wait for the next generation of Nvidia cards to get built-in DX 11.2 support, as these are basically refreshes of the same architecture. It’s all GK110, just with various SMXs disabled.

        • Curzon Dax

          What most people don’t know is that DX support is not always hardware support (though it should be). They can do whatever they want with support in drivers, i.e. software. Sometimes it reflects the hardware; sometimes it could, but they don’t expose it so they have a selling point for the next cards.

      • basroil

        Fermi is missing a few barely used parts of DX11.1, so the entire path is disabled. Sad, but a known fact.