Nvidia Tesla V100 Volta Based 12nm FinFET GPU announced – No GTX 2080 discussed as of Now

GTC has been great in 2017. Nvidia’s keynote showed us the amazing potential of the upcoming Volta-based Tesla V100 GPU with its new Tensor cores, delivering 960 TeraFLOPS of compute per 8-GPU bundle, which will set you back a modest $149,000.
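For a rough sense of those numbers, here is a minimal back-of-the-envelope sketch (assuming the 960 TeraFLOPS figure is the combined throughput of all 8 GPUs in the bundle; the per-GPU split below is our own arithmetic, not an official spec breakdown):

```python
# Back-of-the-envelope math on the announced 8-GPU bundle.
# Figures are as quoted in the article; the per-GPU split is an assumption.
total_tflops = 960          # quoted compute for the 8-GPU bundle
gpu_count = 8               # GPUs in the bundle
bundle_price_usd = 149_000  # quoted bundle price

tflops_per_gpu = total_tflops / gpu_count          # ~120 TFLOPS per V100
price_per_gpu = bundle_price_usd / gpu_count       # ~$18,625 per GPU slot
usd_per_tflop = bundle_price_usd / total_tflops    # ~$155 per TeraFLOP

print(f"~{tflops_per_gpu:.0f} TFLOPS per GPU, "
      f"~${price_per_gpu:,.0f} per GPU slot, "
      f"~${usd_per_tflop:.0f} per TeraFLOP")
```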

The beast packs 5,120 CUDA cores and 21 billion transistors per chip, approaching the edge of what is possible with the current manufacturing process. You can only add so many transistors to a die before you run into a slew of troubles at the quantum scale, one of which is quantum tunneling.

We got to learn a lot about Nvidia’s plans for the coming months, and we now know that the server-grade GPU, the Tesla V100, will be available in Q3-Q4. While there isn’t any news about the consumer models yet, it’s safe to say Volta is an impressive leap in performance compared to Pascal.

While some were awaiting the announcement of a consumer line of graphics cards (the GTX 2080, GTX 2070, and GTX 2060), we instead got a chunk of amazing technologies demoed and a showcase of the potential of the Volta-powered Tesla V100.

So what do we expect from the GTX 2080, and when might it be released? Well, it’s a big if. We are unsure of what performance it will bring, but if we take into account the phenomenal improvement on the server side, we may well see a shadow of that trickling down into the consumer versions too.

With VEGA also on the verge of release, Volta could prove a serious competitor to AMD’s upcoming lineup.

We’re probably going to see the consumer Volta release in early-to-mid 2018, but as mentioned earlier, the Tesla V100 will ship in Q3-Q4. What do you think? What performance can we expect from the consumer cards? Will VEGA be crowned the champion in gaming, or will Nvidia step up their DirectX 12 game with Volta and prove themselves the champs?

For more updates, stay tuned.