
AMD's "tressfx" unveiled; Lara can take care of her own hair [DX11 compute]

It's almost 1:1 in how the hair flows. Look closely: exactly the same haircut, only different techniques.

He means not even close, because the fibers that make up the CG model are individual, while this still has a "grouping" feel. But yeah, in this precise shot it's almost a 1:1 copy, except for the ponytail.
 

Router

Hopsiah the Kanga-Jew
As long as it moves and reacts somewhat like real hair, then great. If it bounces around looking stupid... :(
 

DieH@rd

Banned


:D
 

Durante

Member
I wonder how terrible the implementation on non-AMD GPUs will be, and whether NV will bother to do driver-level code replacement.
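For context, the rough shape of the per-strand work a DX11 compute hair sim has to do looks something like the sketch below. This is a minimal CUDA sketch assuming Verlet integration plus a single distance-constraint pass; every name and the data layout are made up for illustration, not actual TressFX code:

```cuda
// Minimal sketch of a compute-style hair step: Verlet integration plus one
// distance-constraint pass per strand. Hypothetical names/layout, not TressFX.
#include <cuda_runtime.h>

__device__ float3 add3(float3 a, float3 b) { return make_float3(a.x + b.x, a.y + b.y, a.z + b.z); }
__device__ float3 sub3(float3 a, float3 b) { return make_float3(a.x - b.x, a.y - b.y, a.z - b.z); }
__device__ float3 mul3(float3 a, float s)  { return make_float3(a.x * s, a.y * s, a.z * s); }

// One thread per strand keeps this sketch race-free; a real implementation
// would more likely map one thread per vertex and sync through group shared
// memory, which is exactly where per-GPU tuning starts to matter.
__global__ void simulateStrand(float3* pos, float3* prevPos,
                               int verticesPerStrand, int numStrands,
                               float dt, float restLength)
{
    int s = blockIdx.x * blockDim.x + threadIdx.x;
    if (s >= numStrands) return;

    const int    base = s * verticesPerStrand;    // vertex 0 is the pinned root
    const float3 g    = make_float3(0.0f, -9.8f, 0.0f);

    for (int v = 1; v < verticesPerStrand; ++v) {
        int    i = base + v;
        float3 x = pos[i];

        // Verlet: x' = x + (x - xPrev) + g * dt^2
        float3 nx = add3(add3(x, sub3(x, prevPos[i])), mul3(g, dt * dt));

        // Pull the vertex back to its rest distance from the root-side neighbor.
        float3 d   = sub3(nx, pos[i - 1]);
        float  len = sqrtf(d.x * d.x + d.y * d.y + d.z * d.z);
        if (len > 1e-6f)
            nx = add3(pos[i - 1], mul3(d, restLength / len));

        prevPos[i] = x;
        pos[i]     = nx;
    }
}
```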
 
Screenshots look good, but that doesn't really mean much. I can't wait to see it animating. I really hope it works well on all newer cards, Nvidia and AMD. If it doesn't, it'll never catch on and will be a waste of time, implemented in a few games only because AMD incentivised its half-assed usage.
 

K' Dash

Member
I said wow.

[image: Lara screenshot]


It looks really, really great. Honestly, I'm more impressed by this news than when the PS4 was unveiled.

What have you done?!

Me too

I preordered for PC as soon as it became available; I knew Eidos would not fail me.
 
It's pretty believable, since MS owns DirectX. I suppose they could easily hide certain features behind the Xbox for a time. Very clever, MS.

I would be more hyped about this if I hadn't recently bought a GTX 660 Ti. I guess it will be cool when I'm ready to buy another video card.

Ugh, I'm in the same boat, dude. Let's see how many multiplatform games use this tech.
 
I wonder how terrible the implementation on non-AMD GPUs will be, and whether NV will bother to do driver-level code replacement.

They cut back pretty significantly on compute performance going from Fermi to Kepler. Ironic, given the way they've crowed about their compute advantage for so long, only to sacrifice it right when it started to become useful for actual games.
 

Eideka

Banned

Durante

Member
Not surprised. While I'm an Nvidia man, AMD are usually very good at letting everyone use their technology. Nvidia does all the cockblocking.
Letting them use it, sure. But I wouldn't expect them to write their compute shaders in a way that works well on NV hardware (and I wouldn't fault them for that). Then again, I guess it's pretty hard to write ones that work well on GCN but totally suck on Kepler.
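One concrete knob, for what it's worth: GCN executes 64-wide wavefronts while Kepler executes 32-wide warps, so a thread-group size that exactly fills one can leave the other partly idle. A hedged illustration, reusing the hypothetical kernel sketched earlier in the thread:

```cuda
// 64 is a multiple of both GCN's wavefront width (64) and NVIDIA's warp
// width (32), which is one reason cross-vendor compute code often picks it.
// Group size alone doesn't settle performance; register and local-memory
// pressure differ per architecture too.
const int kGroupSize = 64;
int numGroups = (numStrands + kGroupSize - 1) / kGroupSize;  // round up
simulateStrand<<<numGroups, kGroupSize>>>(pos, prevPos,
                                          verticesPerStrand, numStrands,
                                          dt, restLength);
```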
 

Mononoke

Banned
No, PhysX games at high settings run extremely well on my GTX 670.

The only game I've had problems with (in regard to PhysX) is Arkham City. Even with 680s in SLI, I've had FPS issues.

But any other game that offers it, I've had no issues with.
 