
AMD's Future of Compute: CARRIZO APU, 5 Samsung UHD FreeSync displays + more

Akronis

Member
By spanning you mean something like an Eyefinity (or Nvidia's equivalent) setup? I'm not using anything like that. I just use one monitor mainly for games and movies, the second for browser and other stuff.

That's exactly what I'm doing and I've noticed zero problems.
 

Durante

Member
It's a bit amusing to see AMD pushing Mantle (for its great low-level performance on their GPUs!) and Freesync (because it's based on an independent standard!) on the same day. Of course, all for-profit companies are caught in inconsistencies like that at some points.

That could actually be one heckuva thing for Blu-ray playback. You'd get 48 fps clarity without the crappy soap opera effect.
You can already do that on most PC displays (by creating 24 or 48 Hz custom resolutions and configuring your player to use them).
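The arithmetic behind that is simple. Here's a quick sketch (function name and frame count are my own, purely for illustration) of how many refreshes each 24 fps film frame is held on screen at a given fixed refresh rate:

```python
def pulldown_cadence(fps, hz, frames=4):
    """How many refreshes each source frame occupies at a fixed refresh rate."""
    counts, shown = [], 0
    for i in range(1, frames + 1):
        end = (i * hz) // fps      # last refresh on which frame i is displayed
        counts.append(end - shown)
        shown = end
    return counts

print(pulldown_cadence(24, 60))  # -> [2, 3, 2, 3]: uneven pulldown, i.e. judder
print(pulldown_cadence(24, 48))  # -> [2, 2, 2, 2]: every frame held twice
```

At 60 Hz, 24 fps content alternates between 2 and 3 refreshes per frame (the classic pulldown judder); at 48 Hz every frame gets exactly two, which is why a 48 Hz custom resolution looks smoother.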
 

wildfire

Banned
G-Sync will always be more expensive than purely software-driven solutions like FreeSync (which builds on AdaptiveSync's already existing instructions). Sync chip will always be more expensive than no sync chip.

FreeSync isn't software driven even on AMD's hardware. You can't use older GPUs.

More importantly, the reason G-Sync took off is that Nvidia took on the responsibility of making the hardware needed. Display manufacturers still have to retool their production process to make the new internal boards, which are different enough that I'm concerned about how much more money they will charge us compared to displays that stick to the old standards.
 

Irobot82

Member
I'm looking forward to seeing how the Carrizo APU for desktop shapes up. I've been in the market for an HTPC with some light gaming.
 

Nikodemos

Member
FreeSync isn't software driven even on AMD's hardware. You can't use older GPUs.
I don't see how an older, different architecture not supporting a more modern feature can be construed as "hardware-driven". After all, G-Sync is hardware driven not because Fermi doesn't support it (it doesn't), but because it relies on an nVidia sync chip.

More importantly the reason Gsync took is that Nvidia took on the responsibility of making the hardware needed.
But that means they'd have to pay nVidia royalties for the sync chips installed in their displays. And somehow I doubt paying nVidia is cheaper in the long run.

I'm looking forward to seeing how the Carrizo APU for desktop shapes up. I've been in the market for an HTPC with some light gaming.
Oddly enough, AMD are dead-quiet about desktop Carrizo. It isn't known when they'll be brought forth. All information points to mobile being launched first.
 

Phawx

Member
The top end of Kaveri is still too expensive. For me to even care about Carrizo desktop APU, the entire computer would need to be $200.

The problem is that there are way better machines to build in the $400~ space, and Intel and MS are being crazy competitive with Z3785 atom.

Walmart is going to have a $100 Win8.1 tablet (with HDMI out) that uses the Z3785 which really is the best bang for the buck HTPC and also doubles AS A TABLET.
 

Irobot82

Member
Oddly enough, AMD are dead-quiet about desktop Carrizo. It isn't known when they'll be brought forth. All information points to mobile being launched first.

I saw somewhere the max TDP will be 65W. That's down a lot from 95W. No word on DDR4 support or if it'll have embedded SDRAM.
 

Nikodemos

Member
I saw somewhere the max TDP will be 65W. That's down a lot from 95W. No word on DDR4 support or if it'll have embedded SDRAM.
It's highly unlikely to have DDR4 support. HBM is unknown, but unlikely outside at most a couple experimental designs from the likes of HP.
 

mitchman

Gold Member
Is Nvidia _required_ to support freesync if they want to support Displayport 1.2a or higher? Is that part of the VESA standard for 1.2a a requirement for implementors?
 

riflen

Member
I don't see how an older, different architecture not supporting a more modern feature can be construed as "hardware-driven". After all, G-Sync is hardware driven not because Fermi doesn't support it (it doesn't), but because it relies on an nVidia sync chip.

From the FreeSync FAQ.

Virtually all monitors feature circuitry known as a “scaler,” which governs the interaction between the graphics card and the physical LCD panel. This essential circuitry also provides user-facing features like: audio output capabilities, display interfaces (e.g. DisplayPort™ or HDMI®), and the “OSD” (settings menu). AMD has recently entered collaboration with the industry’s largest scaler vendors (MStar, Novatek and Realtek) to create a range of monitor scalers ready for DisplayPort™ Adaptive-Sync by year end; these scalers will pave the way for retail monitor designs that offer compatibility with AMD’s Project FreeSync in 1Q15.

It's good news that the vendors are on board, but hardware is required on the display side for Adaptive-Sync support.

FreeSync (a term which shouldn't be used interchangeably with Adaptive-Sync), is a product feature unique to AMD GPUs that leverages Adaptive-Sync. Hardware is required on the GPU side for FreeSync.

It is, at best, inaccurate to state that Adaptive-Sync or FreeSync are software solutions.

But that means they'd have to pay nVidia royalties for the sync chips installed in their displays. And somehow I doubt paying nVidia is cheaper in the long run.

Frankly, no-one here knows what the costing will be like for these displays. Yes, G-Sync displays could be forever more expensive than a comparable FreeSync display. This is a safe assumption right at this moment, given that NVIDIA is still using FPGAs in their current implementation. However, we do not even know if FreeSync will offer all of the benefits that G-Sync does or if NVIDIA will be producing ASICs sometime in the near future.

Is Nvidia _required_ to support freesync if they want to support Displayport 1.2a or higher? Is that part of the VESA standard for 1.2a a requirement for implementors?

You mean Adaptive-Sync, not FreeSync. See above. No, nobody is required to support Adaptive-Sync in their DisplayPort 1.2a+ compatible devices. It's a feature that can be implemented.
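For what that variable-refresh feature actually buys you, here's a toy model (not the real DisplayPort protocol, just an illustration with made-up numbers) of when rendered frames reach the screen under fixed-refresh v-sync versus an Adaptive-Sync-style display:

```python
import math

def present_times(render_ms, refresh_ms=None):
    """Toy model: when each frame reaches the screen.

    refresh_ms=None models variable refresh (Adaptive-Sync style): the
    display scans out as soon as the frame is done. A fixed refresh_ms
    models classic v-sync: the frame waits for the next refresh boundary.
    """
    t, out = 0.0, []
    for r in render_ms:
        t += r                       # frame finishes rendering at time t
        if refresh_ms is None:
            out.append(t)            # presented immediately
        else:
            out.append(math.ceil(t / refresh_ms) * refresh_ms)  # next vblank
    return out

frames = [20.0, 20.0, 20.0]                         # a steady 50 fps renderer
print(present_times(frames, refresh_ms=1000 / 60))  # 60 Hz v-sync: quantized
print(present_times(frames))                        # adaptive: even 20 ms pace
```

With 60 Hz v-sync, a 50 fps renderer gets its frames snapped to 16.7 ms boundaries, so present-to-present intervals alternate and the motion stutters; with variable refresh the intervals track the render pace exactly.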
 

Locuza

Member
Is Nvidia _required_ to support freesync if they want to support Displayport 1.2a or higher? Is that part of the VESA standard for 1.2a a requirement for implementors?
1.2a and 1.3 provide an optional spec for Adaptive-Sync, but it isn't mandatory.
Which is sad, because Nvidia will for sure ignore the spec and be a scumbag with G-Sync.
 

Tetranet

Member
Hearing of AMD's new hardware and technologies reminds me of some mid-2013 talk about HSA and hUMA on the new consoles. Anyone know for sure what happened with that?

At any rate, more APUs and single-die solutions in general are always welcome.
 
But that would mean buying an AMD card.

Well in that case

[Jerry Seinfeld leaves .gif]
 

Nikodemos

Member
It's good news that the vendors are on board, but hardware is required on the display side for Adaptive-Sync support.
No, it is not. AdaptiveSync is baked into DP 1.2a and higher. What is required is testing by VESA of a particular display configuration to see whether it passes A-Sync certification or not. The actual standard is royalty-free.

From the official A-Sync statement:

As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.

“VESA is constantly evaluating new methods and technologies that add value to both the end user and our OEM member companies. Adaptive-Sync delivers clearly visible advantages to the user for gaming and live video, and contributes to the development of sleeker mobile system designs by reducing battery power requirements,” said Bill Lempesis, VESA Executive Director. “VESA has developed a test specification to certify Adaptive-Sync compliance. Systems that pass Adaptive-Sync compliance testing will be allowed to feature the official Adaptive-Sync logo on their packaging, informing consumers which DisplayPort-certified displays and video sources offer Adaptive-Sync.”

Implementation of DisplayPort Adaptive-Sync is offered to VESA members without any license fee.


Hardware is required on the GPU side for FreeSync.
Whereas for G-Sync (proprietary) hardware is required on both GPU and display side.
 

Nachtmaer

Member
That's exactly what I'm doing and I've noticed zero problems.

I'll keep it in mind. Like I said before, some people had no issues and some experienced stuttering when moving things from one monitor to another.

I saw somewhere the max TDP will be 65W. That's down a lot from 95W. No word on DDR4 support or if it'll have embedded SDRAM.

I think Zen will be the first architecture to support DDR4. By then Intel will also be using it for their mainstream platforms.
 
Guess I'll be buying a Samsung monitor next. Non-proprietary is the way to go. Interested in seeing their GPU lineup and pricing as well.
 

wapplew

Member
The top end of Kaveri is still too expensive. For me to even care about Carrizo desktop APU, the entire computer would need to be $200.

The problem is that there are way better machines to build in the $400~ space, and Intel and MS are being crazy competitive with Z3785 atom.

Walmart is going to have a $100 Win8.1 tablet (with HDMI out) that uses the Z3785 which really is the best bang for the buck HTPC and also doubles AS A TABLET.

I just built one with an A10-7850K; way too expensive compared to other options.
The board costs more, and you need better RAM to get full graphics capability. Even an Intel NUC is cheaper.
 

riflen

Member
No, it is not. AdaptiveSync is baked into DP 1.2a and higher. What is required is testing by VESA of a particular display configuration to see whether it passes A-Sync certification or not. The actual standard is royalty-free.

From the official A-Sync statement:





Whereas for G-Sync (proprietary) hardware is required on both GPU and display side.

I'm not talking about royalties or how much is paid to whom. I'm taking issue with your statement that FreeSync is a "purely software-driven solution" when it relies on Adaptive-Sync, which itself requires a compatible scaler.

Virtually all monitors feature circuitry known as a “scaler,” which governs the interaction between the graphics card and the physical LCD panel. This essential circuitry also provides user-facing features like: audio output capabilities, display interfaces (e.g. DisplayPort™ or HDMI®), and the “OSD” (settings menu). AMD has recently entered collaboration with the industry’s largest scaler vendors (MStar, Novatek and Realtek) to create a range of monitor scalers ready for DisplayPort™ Adaptive-Sync by year end; these scalers will pave the way for retail monitor designs that offer compatibility with AMD’s Project FreeSync in 1Q15.

Now you can say that there are no royalties for the scaler vendors with reference to Adaptive-Sync and you'd be right, but you cannot say that FreeSync is a "purely software-driven solution". People will have to buy new monitors to use FreeSync.
 

Irobot82

Member
Is AMD ending their FX line of desktop CPUs?

It's technically been dead since Piledriver back in 2012. Everything else since then has just been better binned chips. Perhaps when the new x86 Zen architecture drops they'll revive the FX line, but I have a feeling in the end it'll all just be APU's.
 

Nikodemos

Member
I'm not talking about royalties or how much is paid to whom. I'm taking issue with your statement that FreeSync is a "purely software-driven solution" when it relies on Adaptive-Sync, which itself requires a compatible scaler.
No disrespect intended, but that statement makes no logical sense. A display requires a scaler the same way a car requires an engine. Without one, all you have is a panel, a light source and a power brick. A-Sync is already present in the latest standard required from newly-installed VESA-compliant display scalers, all a maker needs is to pass the VESA test for the little sticker on their product.

It's technically been dead since Piledriver back in 2012. Everything else since then has just been better binned chips. Perhaps when the new x86 Zen architecture drops they'll revive the FX line, but I have a feeling in the end it'll all just be APU's.
AM3+ is deader than a slug in a salt mine. If there will ever be another big CPU from AMD, it will definitely feature a new non-backwards compatible socket. Though I find it unlikely for them to make pure CPUs. After all, Intel doesn't anymore.
 
It's technically been dead since Piledriver back in 2012. Everything else since then has just been better binned chips. Perhaps when the new x86 Zen architecture drops they'll revive the FX line, but I have a feeling in the end it'll all just be APU's.

I refuse to build an APU desktop. If the FX line of CPUs doesn't make a comeback, then I'll be investing in an Intel CPU in the future.
 

Irobot82

Member
AM3+ is deader than a slug in a salt mine. If there will ever be another big CPU from AMD, it will definitely feature a new non-backwards compatible socket. Though I find it unlikely for them to make pure CPUs. After all, Intel doesn't anymore.

Agreed. It doesn't even have PCIe 3.0. My best guess is indeed a new socket, full-on SoC, HSA, etc. Zen will be focused on IPC instead of a CMT-type design.
 

riflen

Member
No disrespect intended, but that statement makes no logical sense. A display requires a scaler the same way a car requires an engine. Without one, all you have is a panel, a light source and a power brick.

I'll put it another way. My point is that Adaptive-Sync compatible scalers don't exist yet. People will have to buy a display with a compatible scaler, hence there's a hardware requirement for people to be able to use FreeSync.

In the current G-Sync implementation, the module replaces the scaler circuitry in the display. It's not an extra sync chip working alongside the scaler. Instead, any scaling is taken care of by the GPU.

So, the only real differences between the two approaches is for the display vendors. They either have to choose to invest in development of their own Adaptive-Sync scalers, or pay NVIDIA for G-Sync modules on the models they wish to market as variable refresh capable. Both approaches rely on a combination of hardware and software.
 

Nikodemos

Member
I'll put it another way. My point is the Adaptive-Sync compatible scalers don't exist yet.
The Panasonic WT600 from 2013 had DP 1.2a in addition to HDMI 2.0. I'm pretty certain several new(er) displays featuring DP have already implemented 1.2a. The only ones which definitely haven't are nVidia themselves (their chip offers only DP 1.2, deliberately locking out A-Sync). Of course, that won't help them too much in the future since Adaptive Sync is automatically implemented in DP 1.3 rather than optionally certifiable as in 1.2a.

They either have to choose to invest in development of their own Adaptive-Sync scalers, or pay NVIDIA for G-Sync modules
And, since nVidia love high-value-added stuff (and their scaler is reputedly more complex), which approach do you think is cheaper? Especially since DP 1.2a displays are likely already out there (AMD have stated that some existing displays are already compliant) and cost considerably less than nVidia's sync module?
 

Durante

Member
No disrespect intended, but that statement makes no logical sense. A display requires a scaler the same way a car requires an engine. Without one, all you have is a panel, a light source and a power brick. A-Sync is already present in the latest standard required from newly-installed VESA-compliant display scalers, all a maker needs is to pass the VESA test for the little sticker on their product.
Unless something changed recently, adaptive sync is an optional part of the standard.

The Panasonic WT600 from 2013 had DP 1.2a in addition to HDMI 2.0.
That doesn't mean it has adaptive sync support. I think riflen is correct, there is no display product which supports it on the market.
 
That doesn't mean it has adaptive sync support. I think riflen is correct, there is no display product which supports it on the market.
Well, you're half right. At PDXLAN they confirmed one vendor has already shipped Adaptive-Sync capable monitors.
But they haven't released the FreeSync drivers yet, so you could say it's still not supported.
 

SRG01

Member
Is it just me, or is there very little information about Desktop Carrizo and this is just Mobile (laptop) Carrizo?
 

Locuza

Member
Of course, that won't help them too much in the future since Adaptive Sync is automatically implemented in DP 1.3 rather than optionally certifiable as in 1.2a.
Sadly it will help them, because it's optional in the same fashion.
If you want the Adaptive-Sync logo, you have to pass the test, but if you don't, nobody cares because it's not mandatory.
 

riflen

Member
The Panasonic WT600 from 2013 had DP 1.2a in addition to HDMI 2.0. I'm pretty certain several new(er) displays featuring DP have already implemented 1.2a. The only ones which definitely haven't are nVidia themselves (their chip offers only DP 1.2, deliberately locking out A-Sync). Of course, that won't help them too much in the future since Adaptive Sync is automatically implemented in DP 1.3 rather than optionally certifiable as in 1.2a.


And, since nVidia love high-value-added stuff (and their scaler is reputedly more complex), which approach do you think is cheaper? Especially since DP 1.2a displays are likely already out there (AMD have stated that some existing displays are already compliant) and cost considerably less than nVidia's sync?

You seem to mistake me for someone with an agenda. I'm not interested in taking sides and arguing about evil GPU vendors. I made no statement about the comparative costings of either solution and I don't care to. I posted simply to correct your factually inaccurate statement about FreeSync in the vain hope that people might be better informed on what is a rather confusing topic. Now that I've done that, I'm out.
 

Nikodemos

Member
Unless something changed recently, adaptive sync is an optional part of the standard.
Were it optional, nVidia wouldn't have specifically installed DP 1.2 instead of 1.2a in their scaler chip in order to lock-out A-Sync functionality.

A-Sync can probably be brute-forced active on every single 1.2a display; it's just that it is not 'officially' recognised as supported due to various reasons.

Is it just me, or is there very little information about Desktop Carrizo and this is just Mobile (laptop) Carrizo?
There are various rumours on the net, including some which claim that AMD will skip desktop based Excavator entirely. There are a ton of unknowns.

However, I've found this little snippet regarding their upcoming video cards:

http://videocardz.com/54013/amd-fiji-xt-spotted-at-zauba
 

Locuza

Member
Were it optional, nVidia wouldn't have specifically installed DP 1.2 instead of 1.2a in their scaler chip in order to lock-out A-Sync functionality.

A-Sync can probably be brute-forced active in every single 1.2a, it's just that it is not 'officially' recognised as supported due to various reasons.
They didn't specifically install DP 1.2; 1.2a came after, and when they want to, they can have G-Sync with 1.2a too.
I don't know why Nvidia is so slow at this point, but it's their choice.
 

Durante

Member
Were it optional, nVidia wouldn't have specifically installed DP 1.2 instead of 1.2a in their scaler chip in order to lock-out A-Sync functionality.

A-Sync can probably be brute-forced active in every single 1.2a, it's just that it is not 'officially' recognised as supported due to various reasons.
Wow, are you ever full of shit.

http://www.anandtech.com/show/8533/vesa-releases-displayport-13-standard-50-more-bandwidth-new-features said:
Meanwhile to no surprise (but always good for clarification), DisplayPort Adaptive-Sync remains an optional part of the specification, so Adaptive-Sync availability will continue to be on a monitor-by-monitor basis as a premium feature.

So not only is it obviously optional in 1.2a, it also remains optional in 1.3. The fun part is, I'd be thrilled if adaptive sync were non-optional, because it would be fantastic for me as someone interested in having a selection of great gaming display devices available. But it's not.

And of course, your fantasy about the large-scale "brute-forcing" of something the hardware simply isn't at all built to do is just that: a fantasy.
 

Exuro

Member
So are there any reports/first looks on freesync? I'm interested to see how it performs vs gsync. I'm guessing it's probably better to wait for this type of article when the monitors actually release.
 

Locuza

Member
Please. nVidia deliberately locked A-Sync out. It's fully coherent with their regular M.O. Close that garden as much as you can.

http://tech4gamers.com/nvidia-says-no-to-displayport-1-2a-and-vesa-adaptive-sync/
I know that Nvidia is locking out A-Sync, but my simple statement is still that A-Sync is optional in 1.2a and 1.3.
And I also fully agree that Nvidia's policy is just anti-consumer.

If I quote your source:
Then Adaptive sync is the biggest news with the DisplayPort 1.2a, which is almost identical to the regular 1.2, So there is simply no reason for Nvidia to advertise or indeed even to implement this support. G-Sync will be Nvidia’s commitment to screen synchronization future, which means that Nvidia has no reason to update the interface until the next major revision, which becomes 1.3.

In the end, Nvidia simply doesn't care about 1.2a, and if A-Sync is the only major component which came with the spec revision, I understand why Nvidia publishes negative PR.
But neither 1.2a nor 1.3 will force Nvidia to support it, as long as it remains optional.
 

hesido

Member
God dammit, I hope HDMI adds freesync capability to the new specs... I'll be damned if we don't have variable refresh rate on PS5..
 
The real question is whether FreeSync will be just as good as G-Sync. We'll need to wait for testing to find out; it's not like AMD would come out and say it's inferior in some way.

God dammit, I hope HDMI adds freesync capability to the new specs... I'll be damned if we don't have variable refresh rate on PS5..

Maybe they'll add DP to the consoles?

LOL nah.
 

Joezie

Member
From the AMD newsroom
In addition, Capcom announced its collaboration with AMD on the AMD Mantle API to enhance Capcom’s “Panta-Rhei” engine, enabling enhanced gaming performance and visual quality for upcoming Capcom game titles.

“This will improve the performance of our ‘Panta-Rhei’ engine, which was originally developed for console platforms,” said Masaru Ijuin, technical director, Capcom. “Capcom is evaluating AMD’s Mantle technology to help improve the graphics pipeline, and integrate it into ‘Panta-Rhei’ to provide outstanding benefits and impressive performance for gamers as well as the gaming developers.”
 

wachie

Member
It's a bit amusing to see AMD pushing Mantle (for its great low-level performance on their GPUs!) and Freesync (because it's based on an independent standard!) on the same day. Of course, all for-profit companies are caught in inconsistencies like that at some points.
Nvidia, the leader in proprietary tech does the same. I don't see you pointing these inconsistencies out for them, what gives?
 

LiquidMetal14

hide your water-based mammals
Nvidia, the leader in proprietary tech does the same. I don't see you pointing these inconsistencies out for them, what gives?
I'm an Nvidia guy and I agree with the sentiments regarding this. Not necessarily piggybacking and hopping on his case, but let's step back here and see what FreeSync is when it hits.

I just want a competitive market.
 