
Epic: UE4 unveil this year, Samaritan on 1 card w/ low wattage, More [Updated]

Mik2121

Member
Hmm... something makes me think that the UE4 demo was this Epic Citadel demo, but of course with much more advanced tech. There are lots of places to show off tessellation, cloth physics, etc.

http://www.unrealengine.com/flash/

I'm 99% sure that wasn't it. It probably had more to do with the Samaritan than that one. Though they might have used the Unreal world to show it, or even Gears of War... 4 :p
 

artist

Banned
Straight from Nvidia; don't ask how I got it.

[image: capturevxjwv.png]
 

Corky

Nine out of ten orphans can't tell the difference.
^Yep @ Laboured.



How'd you get it? :)

Wow. One card is almost as loud as three? [face_cringe]

I wonder if that's right.

Logarithmic scale: a 5 dB difference is about 3x quieter.

Also, lol, that's some BS advertising; one 680 is not equal to three 580s by any means...
 

Mandoric

Banned
dBA values don't simply add; for example, two 30 dBA noises at one point do not equal 60 dBA.

It's logarithmic: ten 30 dB sources sum to 40 dB. A doubling or halving is worth around 3 dB, so by those numbers the 680 is a bit over a quarter as loud as triple 580s, probably about equal to a single 580.
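A quick sketch of that math in Python (the ~46 dBA figure for a single 680 is my assumption from the ~5 dB gap people are quoting, not a confirmed chart value):

```python
import math

def combine_dba(levels):
    """Incoherent sources combine by summing linear power,
    not by adding dB values directly."""
    return 10 * math.log10(sum(10 ** (db / 10) for db in levels))

print(combine_dba([30, 30]))    # ~33.0 dBA, not 60
print(combine_dba([30] * 10))   # 40.0 dBA: 10x the power = +10 dB
# Assumed chart values: one 680 at ~46 dBA vs. triple 580s at 51 dBA.
print(10 ** ((51 - 46) / 10))   # ~3.16x the sound power, not 5x
```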
 

i-Lo

Member
All of a sudden I feel the pull toward the green side, and I wonder why a team of brilliant engineers was paired with such unfriendly and unyielding license management. I'm going to miss nVidia next console gen. I wonder if there is a consolation for that.

Can't wait to see the specs though.
 

iceatcs

Junior Member
It's logarithmic: ten 30 dB sources sum to 40 dB. A doubling or halving is worth around 3 dB, so by those numbers the 680 is a bit over a quarter as loud as triple 580s, probably about equal to a single 580.
I know, that's why I said it can't be calculated with simple maths. I didn't mean literally adding the sounds.

Isn't it that 3 dB more is twice as noisy?

edit: Mandoric with the math :p

Look!!
Just don't ask me.

People here thought one 580 would be 51 dBA ÷ 3.
 

Corky

Nine out of ten orphans can't tell the difference.
Bullshit. One 680 doesn't beat two 580s, let alone three, and we don't even know for sure how much faster a 680 is than a 580.

I think what they're trying to get across is:

"Hey look, in the past we needed 3 of these to run this demo, but now we only need this one!"

In other words: hey, we've done tons of optimization and changed the settings, and now this one card manages to run the tech demo.

Everything is pointing towards another 15-20% jump from 580 to 680, reminiscent of that between 480 and 580. At stock clocks.
 
It's logarithmic: ten 30 dB sources sum to 40 dB. A doubling or halving is worth around 3 dB, so by those numbers the 680 is a bit over a quarter as loud as triple 580s, probably about equal to a single 580.

Thanks for that. It would still seem, though, that since the 680 supposedly has a lower TDP, it wouldn't be roughly equal in dBA.
 
All of a sudden I feel the pull toward the green side, and I wonder why a team of brilliant engineers was paired with such unfriendly and unyielding license management. I'm going to miss nVidia next console gen. I wonder if there is a consolation for that.

Nvidia's engineering imo has been behind AMD's for a while now, and that's with likely much more money to play with.

It seems to have FINALLY turned around with Kepler, where they got rid of that stupid hot clock, among other things. But I wouldn't give Nvidia too much credit for finally spitting out a good part after years of imo lackluster ones carried by their brand and marketing. With their budget advantages they should have been creaming AMD for years now, on the level Intel does.
 

Jtrizzy

Member
I know this has been asked over and over, but do we know if this is the "top of the line" card, as the 580 is now? (I'm not counting the 590, which is sort of a dual GPU or whatever.)
 
Bullshit. One 680 doesn't beat two 580s, let alone three, and we don't even know for sure how much faster a 680 is than a 580.

To be fair, it should be about 2x a 580 in raw flops. The rest, as others said, is optimization.

I know this has been asked over and over, but do we know if this is the "top of the line" card, as the 580 is now? (I'm not counting the 590, which is sort of a dual GPU or whatever.)

Supposedly the top-of-the-line card is tentatively scheduled for August. Rumored at 2300 SPs, so it should be ~50% faster than the 680 by raw specs.

Who even knows what the price will be, if the 680 is $549. Personally I expect Nvidia will lower the 680 to $399 or something at that time, creating more room.
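Back-of-the-envelope on those numbers (the SP counts and clocks below are the rumored figures floating around the thread, not confirmed specs):

```python
def gflops(sps, shader_clock_mhz):
    # 2 FLOPs per SP per clock (multiply-add)
    return 2 * sps * shader_clock_mhz / 1000

gtx580 = gflops(512, 1544)   # Fermi: hot clock, shaders at ~2x core clock
gtx680 = gflops(1536, 1006)  # Kepler rumor: shaders at core clock

print(f"580 ~{gtx580:.0f} GFLOPS, 680 ~{gtx680:.0f} GFLOPS")  # ~1581 vs ~3090
print(f"ratio: {gtx680 / gtx580:.2f}x")          # ~1.95x, i.e. about 2x raw flops
print(f"big Kepler vs 680: {2300 / 1536:.2f}x")  # ~1.50x by SP count alone
```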
 

artist

Banned
Nvidia's engineering imo has been behind AMD's for a while now, and that's with likely much more money to play with.
I think that's an oversimplified generalization. The design goals at Walsh Av and Commerce Valley are not the same.

All of a sudden I feel the pull toward the green side, and I wonder why a team of brilliant engineers was paired with such unfriendly and unyielding license management. I'm going to miss nVidia next console gen. I wonder if there is a consolation for that.

Can't wait to see the specs though.
That is just because the culture at Nvidia is different, a bit more aggressive.
 

Jtrizzy

Member
Thanks for the info. So August for the 580 replacement? I wasn't wanting to wait that long, but I don't want to half-ass it or end up buying multiple cards in the same gen. Does that make this the equivalent of a 570 or a 560 Ti? I guess Nvidia doesn't like to classify them in this manner, so the first cards will sell.
 

SapientWolf

Trucker Sexologist
Epic has been optimizing the demo for a year, probably with lots of help from Nvidia. It helps to sell the new cards.
Didn't they drop the res from 1600p to 1080p as well? Losing 2 million pixels should help boost the framerate a bit. They also got rid of the supersampled edge anti-aliasing.
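The pixel math checks out, assuming the original Samaritan showing ran at 2560x1600:

```python
samaritan_2011 = 2560 * 1600   # "1600p" = 4,096,000 pixels (assumed res)
samaritan_2012 = 1920 * 1080   # 1080p   = 2,073,600 pixels

print(samaritan_2011 - samaritan_2012)  # 2,022,400 -> ~2 million fewer pixels
print(samaritan_2011 / samaritan_2012)  # ~1.98x -> nearly half the shading work
```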
 

Jtrizzy

Member
Didn't they drop the res from 1600p to 1080p as well? Losing 2 million pixels should help boost the framerate a bit. They also got rid of the supersampled edge anti-aliasing.

That's what I thought too, but I don't have a link yet. I know they switched to FXAA from (I'm assuming) MSAA, but I also think there was a drop in resolution.
 

Globox_82

Banned
I think when they show this UE4 demo, it will be at some nextbox event, maybe a few days before the Wii U gets released, to "kill" the hype or something like that. That's what I would do.
 

Jtrizzy

Member
Just speculation, but I think we'll see it soon. I think Reign was referring to Nvidia when he mentioned it not being up to him when the demo would be shown. They can show it off on the new cards, pushing Sony/MS to include the specs they want for next gen.
 

artist

Banned
Why are we arguing over FXAA when TXAA will make it useless down the road? Unreal 4 will also be one of the major titles to adopt it.
 

Blizzard

Banned
Why are we arguing over FXAA when TXAA will make it useless down the road? Unreal 4 will also be one of the major titles to adopt it.
Okay I'm apparently confused, so I have questions:

1. Is TXAA temporal anti-aliasing? Don't people hate that? People complained a bunch when Crysis 2 used it, if I recall correctly. Maybe it would be okay if ghosting did not happen, but won't it still blur things?

2. When was Unreal 4 announced? I thought there was just Unreal Engine 4, and not an Unreal "title".
 

artist

Banned
Okay I'm apparently confused, so I have questions:

1. Is TXAA temporal anti-aliasing? Don't people hate that? People complained a bunch when Crysis 2 used it, if I recall correctly. Maybe it would be okay if ghosting did not happen, but won't it still blur things?

2. When was Unreal 4 announced? I thought there was just Unreal Engine 4, and not an Unreal "title".
1. One of the modes of TXAA is temporal.
2. It wasn't. I meant to say UE4.0 will use TXAA, not a title.
 
Why are we arguing over FXAA when TXAA will make it useless down the road? Unreal 4 will also be one of the major titles to adopt it.

Somebody on B3D said they understand TXAA introduces lag (because it uses two frames or whatever), and thus FPS players won't like it.

Always a catch.
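For what it's worth, here's a toy sketch of why a temporal mode trades lag/ghosting for smoother edges. This is generic temporal blending, not Nvidia's actual TXAA algorithm, whose details weren't public:

```python
import numpy as np

def temporal_blend(history, current, alpha=0.5):
    """Mix the previous (ideally reprojected) frame with the current one.
    Aliasing that flickers between frames averages out, but the output
    always carries data from an older frame -> perceived lag, and any
    reprojection mismatch shows up as ghosting."""
    return alpha * history + (1 - alpha) * current

prev_frame = np.array([0.0, 0.0, 1.0, 1.0])  # edge landed one pixel left
curr_frame = np.array([0.0, 1.0, 1.0, 1.0])  # edge landed one pixel right
print(temporal_blend(prev_frame, curr_frame))  # [0. 0.5 1. 1.] - softened edge
```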
 

sp3000

Member
TXAA just seems like another temporal solution that blurs the screen. I'd rather have a few jaggies on my screen than the whole thing looking like it was smeared like Crysis 2.
 

squidyj

Member

I think that matches up with Lottes' quote about the 680 and the architecture and whatnot. If Unreal 4 is being designed from the bottom up for next gen, they're going to be able to do a lot more and take much better advantage of cards like the 680 than they would with something on Unreal 3 (even modified), which is tied to the boat anchor that is the current gen consoles.
 

KKRT00

Member
TXAA just seems like another temporal solution that blurs the screen. I'd rather have a few jaggies on my screen than the whole thing looking like it was smeared like Crysis 2.

Where did you get the idea that TXAA blurs the screen? You know that proper shader AA doesn't need to blur textures? You already have examples of that in MLAA, SMAA, and FXAA 4, and TXAA is based on FXAA 4.
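A crude illustration of that point (generic edge-directed filtering in the MLAA/FXAA family, not any shipping implementation): these filters measure local contrast first and only blend where they detect an edge, so ordinary texture detail passes through untouched:

```python
import numpy as np

def edge_aware_aa(row, threshold=0.25):
    """Blend a 1D row of luma values only where neighbor contrast
    exceeds a threshold; non-edge pixels pass through untouched."""
    out = row.copy()
    for i in range(1, len(row) - 1):
        if max(abs(row[i] - row[i - 1]), abs(row[i] - row[i + 1])) > threshold:
            out[i] = 0.25 * row[i - 1] + 0.5 * row[i] + 0.25 * row[i + 1]
    return out

row = np.array([0.0, 0.0, 1.0, 1.0, 0.9, 0.92])  # hard edge, then texture
print(edge_aware_aa(row))  # only the two edge pixels get blended
```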
 