
Rumor: NVIDIA GeForce GTX 1660 Ti to feature 1536 CUDA cores

GoldenEye98

posts news as their odd job
https://videocardz.com/newz/rumor-nvidia-geforce-gtx-1660-ti-to-feature-1536-cuda-cores

The name is not yet fully confirmed, although we have heard it from two sources (three including the source of this picture).

The GeForce GTX 1660 Ti is set to become NVIDIA’s first Turing-based card under the GTX brand. Essentially, this card lacks the ray tracing features of the RTX series, which should (theoretically) result in a lower price.

The new SKU features the TU116 graphics processor and 1536 CUDA cores, which means the GTX 1660 Ti will undoubtedly be slower than the RTX 2060.

Contrary to some rumors, our sources claim that the GTX 1660 Ti still features GDDR6 memory and a 192-bit bus. So far we have not heard anything about GDDR5(X) variants.
 

Azurro

Banned
I'm not the target audience for this (I don't do PC gaming), but damn, graphics card names are obscure as fuck.

I suppose this makes more sense than the RTX2060. Ray tracing just seems not ready for primetime when activating the feature halves the framerate.
 

888

Member
Wait. How did we go from GTX 10xx to RTX 20xx to GTX 16xx? This is almost as bad as Microsoft’s naming convention.
 

Redneckerz

Those long posts don't cover that red neck boy
I thought it was going to be called GTX 1160? Lenovo had a leak last year.
I'm not the target audience for this (I don't do PC gaming), but damn, graphics card names are obscure as fuck.

I suppose this makes more sense than the RTX2060. Ray tracing just seems not ready for primetime when activating the feature halves the framerate.
There is a valid reason why it has such a performance deficit, and it's not because it's not ready. You get playable framerates at 1080p (the resolution the majority of gamers play at) with a 2060 right now, in BFV that is. And BFV is not the best example to work with, either.
 

Fbh

Member
For the next generation of cards they need to go full Square Enix with the naming
"Hey guys I just upgraded to a GTX 1274.5/2 Dream chain cucumber shadow"
"Wow lucky you. I'm still rocking my GTX 1152.15/15² Darkness 25Cº dominon:code but I'll have to upgrade soon"
 

Azurro

Banned
I thought it was going to be called GTX 1160? Lenovo had a leak last year.

There is a valid reason why it has such a performance deficit, and it's not because it's not ready. You get playable framerates at 1080p (the resolution the majority of gamers play at) with a 2060 right now, in BFV that is. And BFV is not the best example to work with, either.

Every single benchmark I see online shows this behaviour: an almost 50% decrease in framerate. There is of course a valid technical reason, but it's senseless for the consumer to get the RTX 2060 if turning on a single graphics option can tank performance that much.

Better to wait for a new generation with better ray tracing performance. This is gen 1 and the performance is horrid. No matter the reason, cutting the framerate in half for a subtle addition is never acceptable.
 

CuNi

Member
Hopefully we'll get GTX versions of all cards. I'd love to get my 80ti as planned but no way I'm going to sell a kidney for it.
 

dirthead

Banned
Hopefully people just continue to not buy them and prices go down. It was getting out of control: $1,200 for a supposedly consumer-level GPU. What a joke.
 

MetalRain

Member
I wonder what the price point is. $250? So something that beats the 580 and 1060 but not quite the Vega 56 or 1070 Ti/2060?

Every single benchmark I see online shows this behaviour: an almost 50% decrease in framerate... Cutting the framerate in half for a subtle addition is never acceptable.
It wasn't so long ago that HBAO meant a 10-30% FPS loss vs. no AO. How much performance would you be willing to sacrifice for raytraced reflections?
 

llien

Member
Hopefully we'll get GTX versions of all cards. I'd love to get my 80ti as planned but no way I'm going to sell a kidney for it.
Next-gen 7nm from both companies is expected within a year, with AMD rumored to go on the offensive with a $250 card at 1080-level performance.

Come on AMD. Give em hell! Force them to lower their prices.
Hopefully you were being sarcastic. If not, this is why AMD didn't have the money for proper R&D on both the CPU and GPU fronts.

I wonder what the price point is. $250? So something that beats the 580 and 1060 but not quite the Vega 56 or 1070 Ti/2060?
You mean 590? XFX Fatboy sells for 239 Euro here. Comes with 3 games.
 

Pagusas

Elden Member
I'm not the target audience for this (I don't do PC gaming), but damn, graphics card names are obscure as fuck.

I suppose this makes more sense than the RTX2060. Ray tracing just seems not ready for primetime when activating the feature halves the framerate.

The RTX cards are first-to-market, 1st-gen hardware; anyone buying them knows they are basically beta testing this stuff for future generations. It needed to launch now so devs start using it and so it forces other companies (MS, AMD) to take interest and eventually make ray tracing a real thing in the market. It had to come out now to get that ball rolling, and we should all be grateful the world is full of idiots with lots of money to buy these, so we can get the real deals in 2 years.
 

Redneckerz

Those long posts don't cover that red neck boy
Every single benchmark I see online shows this behaviour: an almost 50% decrease in framerate. There is of course a valid technical reason, but it's senseless for the consumer to get the RTX 2060 if turning on a single graphics option can tank performance that much.
We literally went from zero raytracing (the deficit was too big) to actual, playable performance at 1080p60 within 1.5 years. Hell, pre-patch BFV would have tanked on the RTX 2060 at 1080p and barely hit 30 fps. Now we are already getting double the performance with raytracing on.

I am not sure you understand: raytracing is expensive. A lot. This is why it's only used in a hybrid fashion and not full-scene, where everything is traced.

Better to wait for a new generation with better ray tracing performance. This is gen 1 and the performance is horrid. No matter the reason, cutting the framerate in half for a subtle addition is never acceptable.
There are two things majorly wrong with this:
  • Performance is not horrid. See above.
  • "Subtle addition", you say? I've already stated that BFV is not the best example, but perhaps Atomic Heart is a better one? That one actually does the whole RTX package of raytraced reflections, refraction, and shadowing, leading to all kinds of visual effects.
 

GoldenEye98

posts news as their odd job
Apparently the name is not fully confirmed. Honestly I think they should just call it the GTX 2050 ti.
 

Azurro

Banned
We literally went from zero raytracing (the deficit was too big) to actual, playable performance at 1080p60 within 1.5 years. Hell, pre-patch BFV would have tanked on the RTX 2060 at 1080p and barely hit 30 fps. Now we are already getting double the performance with raytracing on.

I am not sure you understand: raytracing is expensive. A lot. This is why it's only used in a hybrid fashion and not full-scene, where everything is traced.


There are two things majorly wrong with this:
  • Performance is not horrid. See above.
  • "Subtle addition", you say? I've already stated that BFV is not the best example, but perhaps Atomic Heart is a better one? That one actually does the whole RTX package of raytraced reflections, refraction, and shadowing, leading to all kinds of visual effects.

Lol, I'm a developer (not of games though, thankfully), of course I understand how expensive ray tracing is. However, I'm afraid you are the one who doesn't understand. A 1080 Ti is capable of running games at 4K 60 FPS at higher settings than consoles. Turning on raytracing brings performance down to 1080p at 60 fps, after two rounds of optimisation of the implementation. Is it much improved? Sure. Is it impressive? Yes, of course; it's some sort of real-time ray tracing, and that's impressive.

However, for a consumer product, this is still a bit of a gimmick. From what I understand, a lot of the card's transistors are dedicated to ray tracing, and at the moment their main use is more accurate reflections and refractions (rasterisation isn't going away any time soon; this is a hybrid approach). To gain that, all you need to do is cut your pixel count to a quarter or tank your frame rate by half.
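The arithmetic behind that trade-off is easy to check; a quick back-of-the-envelope sketch (the framerates are illustrative, not benchmarks):

```python
# Dropping from 4K to 1080p quarters the per-frame pixel work,
# and halving the framerate doubles the per-frame time budget.

UHD = 3840 * 2160  # "4K" pixel count
FHD = 1920 * 1080  # 1080p pixel count

pixel_ratio = UHD / FHD
print(pixel_ratio)  # 4.0

fps_rt_off, fps_rt_on = 60, 30   # illustrative framerates only
budget_off = 1000 / fps_rt_off   # ~16.7 ms per frame without RT
budget_on = 1000 / fps_rt_on     # ~33.3 ms per frame with RT
print(budget_on - budget_off)    # ~16.7 ms freed up per frame
```

Either way, the cost is roughly a 4x cut in pixel throughput, which matches the "1080p instead of 4K, or half the framerate" complaint above.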

For a consumer product, this is a damn gimmick meant for over-excited PC Master Race people, and I have to give it to Nvidia: they are masters of hyping up their fanbase. It's still ridiculous performance, and anyone sane should wait for a second-generation product with a better implementation.
 

Redneckerz

Those long posts don't cover that red neck boy
Lol, I'm a developer (not of games though, thankfully), of course I understand how expensive ray tracing is.
Do you?

However, I'm afraid you are the one who doesn't understand.
Humor me. What am I not understanding, exactly? You can already get RT through CUDA or GCN cores anyway.

A 1080 Ti is capable of running games at 4K 60 FPS at higher settings than consoles. Turning on raytracing brings performance down to 1080p at 60 fps, after two rounds of optimisation of the implementation. Is it much improved? Sure. Is it impressive? Yes, of course; it's some sort of real-time ray tracing, and that's impressive.
Well, that's the point really, and we are talking RTX 2060 here, not GTX 1080 Ti. The 1080 Ti has no RT cores, meaning all its raytracing has to be done by brute force, since I doubt DXR is in play. But even an RTX-less card can pump out fascinating raytracing, as the demoscene shows.

However, that is still using stream processor cores that are built for rasterization, not raytrace acceleration.

However, for a consumer product, this is still a bit of a gimmick. From what I understand, a lot of the card's transistors are dedicated to ray tracing, and at the moment their main use is more accurate reflections and refractions (rasterisation isn't going away any time soon; this is a hybrid approach). To gain that, all you need to do is cut your pixel count to a quarter or tank your frame rate by half.
And you still get playable framerates where certain effects look better, meaning it's already viable today. Thanks for the reminder.
 

kiphalfton

Member
Seems underwhelming. Not sure what the point of releasing it is, since it's probably gonna be about as powerful as the GTX 1070... but with less VRAM. It would be a pretty lukewarm deal at $250 (but I'm sure that will be the MSRP, and AIB cards will be upwards of $300).
 

Leonidas

Member
$250-$300 for near GTX 1070 performance would make this the most powerful card and probably the best value in the sub $300 price range. Should be solid for those who don't want to pay the RTX premium.

1660 Ti is probably a throwback to the 660 Ti. I like the naming.
 

mumbo's brain

Neo Member
I think they have too many options, but I guess if the card has some definitive performance difference compared to the GTX 1080 etc., maybe then. I don't know. Are they already admitting ray tracing is just a fad, since everything can still be done with nice shaders anyway? Pixar managed without ray tracing for quite a long time (they only used it in specific shots where it was specifically needed for better reflections and so on). Of course, mentioning Pixar at this point is obsolete on many levels. I dunno. What I would really like to see is real-time, large-scale volumetric effects in games, and maybe RTX will push that stuff forward much more.
 

Azurro

Banned
Do you?


Humor me. What am I not understanding, exactly? You can already get RT through CUDA or GCN cores anyway.


Well, that's the point really, and we are talking RTX 2060 here, not GTX 1080 Ti. The 1080 Ti has no RT cores, meaning all its raytracing has to be done by brute force, since I doubt DXR is in play. But even an RTX-less card can pump out fascinating raytracing, as the demoscene shows.

However, that is still using stream processor cores that are built for rasterization, not raytrace acceleration.


And you still get playable framerates where certain effects look better, meaning it's already viable today. Thanks for the reminder.

Ah, is this a member of the Nvidia fanboy species? Come on, kid, this is not a lab; this is a consumer product. Drop the fanboyism. It's not the customer's job to care how complex the algorithm or the calculations are.

The fact is, this heavily optimised algorithm, backed by specialised hardware, drops the framerate by half or brings the resolution to its knees for a few subtle effects. Because of course it's so expensive that the only thing it can do at the moment is more accurate reflections and refractions.

Nvidia is marketing a marquee feature that, in real terms, delivers subtle improvements and shit performance on any of the RTX 20XX cards when you turn it on.

It's not ready for primetime, it's a gimmick at the moment.
 

Leonidas

Member
Are they already admitting ray tracing is just a fad

Ray-tracing is not a fad. It's just not feasible to bring RTX to low-end cards this generation. AMD is working on hybrid-rendering GPUs right now as well; they just aren't ready yet. I wouldn't be surprised to see an RTX competitor from AMD in ~2 years.
 

Redneckerz

Those long posts don't cover that red neck boy
Ah, is this a member of the Nvidia fanboy species?
Compelling question. Perhaps you should ask more; then you would find out that I enjoy graphics in general, and love AMD's more unique products coming out.

But if you want confirmation: my rusty old PC has an integrated GeForce 6150. Shit, I guess that seals my fate then.

Come on kid, this is not a lab, this is a consumer product. Drop the fanboyism, it's not the customer's job to care how complex the algorithm or the calculations are.
Evidently that's why it's dismissed, and I blame BFV for it, since it really does not show it in the right light (heh). That's on Nvidia though, rushing the product to market when the only game that could support it in time is one of the worse titles to pick from. I reckon Metro would be a better example.

(I'm sounding like a real Green Team fanboy here now, am I?)

The fact is, this heavily optimised algorithm backed by specialised hardware drops the framerate by half or brings the resolution to its knees for a few subtle effects.
"Brings the resolution to its knees"? The RTX 2060 does RT on medium at 1080p60. 1080p is the majority resolution for Steam gamers, so it's definitely in a playable state. I am not sure what planet you come from if you think 60 fps, on a card that ought to be played at 1080p (with RT on) and 1440p (with RT off) anyway, is shit performance.

Heck, it can even do 4K if you want it to and still get playable framerates...

Ray-tracing is not a fad. It's just not feasible to bring RTX to low-end cards this generation. AMD is working on hybrid-rendering GPUs right now as well; they just aren't ready yet. I wouldn't be surprised to see an RTX competitor from AMD in ~2 years.
Exactly, which is why the RTX 2060 is likely to be the base model. Drop the core count any lower and RT performance adds too little to make it work at 1080p.
 

llien

Member
Performance is not horrid.

Look at what is actually done behind the scenes. This is the raw result of one ray per pixel (where RTX currently sits), with RT used only for shadows:

[image: Vzd3grC.png]



Denoising applied to the scene above:

[image: laHA5wU.png]
 

Redneckerz

Those long posts don't cover that red neck boy
Look at what is actually done behind the scenes. This is the raw result of one ray per pixel (where RTX currently sits), with RT used only for shadows:

[image: Vzd3grC.png]



Denoising applied to the scene above:

[image: laHA5wU.png]
It's 1080p60 on an RTX 2060 on RT Medium. Yes, it is hybrid rendering. Yes, it is limited compared to offline raytracers and other solutions, as it only does a subset of what full raytracing does. That does not mean performance is horrid.
 

llien

Member
it only does a subset of what full raytracing does
Ahem.
It only does shadows in that pic.
That is using 1 ray per pixel (the only level within reach for RTX).
And this is how those RTed shadows look unless denoised:

[image: Vzd3grC.png]


As for full-frame RT: well, perhaps it's not that far away. Even 2 spp doesn't look that bad to me:

 

Redneckerz

Those long posts don't cover that red neck boy
Ahem.
It only does shadows in that pic.
I am talking about RTX in general. Even the most feature-rich game coming out, Atomic Heart, doing RT reflections/refractions/shadows, is using just a subset of the possibilities RT has.
 

llien

Member
I am talking about RTX in general. Even the most feature-rich game coming out, Atomic Heart, doing RT reflections/refractions/shadows, is using just a subset of the possibilities RT has.
I'd urge you to watch the video I've linked, from the very point it's linked at. It shows where we are with 50 paths per pixel. (Not arguing about anything in particular; I just find it interesting.)
nVidia is at 1 ray per pixel at the moment.
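For what it's worth, the gap between 1 ray per pixel and 50 paths per pixel is mostly a noise gap: Monte Carlo error shrinks with the square root of the sample count, which is why 1 spp output needs a denoiser to look presentable. A toy sketch, with made-up helpers (`render_pixel`, `rms_error`) and illustrative noise values, nothing to do with RTX internals:

```python
import math
import random

def render_pixel(spp, rng, true_value=0.5, noise=0.3):
    """Toy Monte Carlo: average `spp` noisy light samples for one pixel."""
    return sum(true_value + rng.gauss(0, noise) for _ in range(spp)) / spp

def rms_error(spp, trials=2000, seed=1):
    """Root-mean-square error of the pixel estimate over many trials."""
    rng = random.Random(seed)
    return math.sqrt(sum((render_pixel(spp, rng) - 0.5) ** 2
                         for _ in range(trials)) / trials)

# Error falls ~1/sqrt(spp): 50 spp comes out roughly 7x cleaner than 1 spp,
# so 1 spp frames look like static until a denoiser averages them spatially.
```

That ~7x is exactly why the raw 1 spp screenshots earlier in the thread look so grainy before denoising.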
 

Redneckerz

Those long posts don't cover that red neck boy
I'd urge you to watch the video I've linked, from the very point it's linked at. It shows where we are with 50 paths per pixel. (Not arguing about anything in particular; I just find it interesting.)
nVidia is at 1 ray per pixel at the moment.
I am aware of it. I am also aware of what you are saying. I'll also watch that video.

I am not exactly sure why you take note of the phrasing that Nvidia is only using a subset of RT - your responses actually support this statement.
 

llien

Member
I am not exactly sure why you take note of the phrasing that Nvidia is only using a subset of RT - your responses actually support this statement.
Because while it is what RT (or rather path tracing, as Pixar's experience shows) can do, current gen hardware is far away from doing.
 

Redneckerz

Those long posts don't cover that red neck boy
Because while it is what RT (or rather path tracing, as Pixar's experience shows) can do, current gen hardware is far away from doing.
Your sentence is not flowing, so I am not sure if I understand this correctly:
You are saying that because whilst it is RT, it is still far away from Pixar levels of rendering?

Because i mean... Yeah? I never contended otherwise?
 

Kenpachii

Member
Should just call it the 2050 so people at least understand the naming. This is just confusing for everybody.
 

ethomaz

Banned
Apparently the name is not fully confirmed. Honestly I think they should just call it the GTX 2050 ti.
If it is based on Pascal, then it should be GTX 1xxx.

1660 is not a bad name at all... shows it is a refresh for Pascal cards that didn’t have ray-tracing features of the new Arch.

Should just call it the 2050 so people at least understand the naming. This is just confusing for everybody.
2050 is more confusing because it lacks features of the 2000 family.

It should be in 1000 family.
 

bati

Member
I just hope some new info comes out soon; I'm about 3 weeks away from buying a new card, and so far the 2060 looks like the best option, even if raytracing is kinda wasted on the xx60 series.

Wouldn't mind getting a card for 100€ less at same performance minus a gimmick feature that I won't even use.
 

GoldenEye98

posts news as their odd job
If it is based on Pascal, then it should be GTX 1xxx.

1660 is not a bad name at all... shows it is a refresh for Pascal cards that didn’t have ray-tracing features of the new Arch.


2050 is more confusing because it lacks features of the 2000 family.

It should be in 1000 family.

Except it's not Pascal... it's a Turing card without the ray-tracing features of the RTX cards... hence why I suggested it would make more sense to call it a GTX 2050 ti.
 

kraspkibble

Permabanned.
Whoever thought calling it a 1660 Ti was a good idea needs to be fired. It should be:

GTX 2050
GTX 2060
RTX 2060 Ti
RTX 2070
RTX 2080
RTX 2080 Ti

or even better:

GTX 2030
GTX 2050
RTX 2060
RTX 2070
RTX 2080
RTX 2080 Ti
 