Nikodemos
Member
"I realized that another benefit of FreeSync is 24 fps support for video."

That could actually be one heckuva thing for Blu-ray playback. You'd get 48 fps clarity without the crappy soap opera effect.
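To put some numbers on the judder difference, here's a quick Python sketch (a toy cadence model I made up for illustration, not anything from the FreeSync spec): at a fixed 60 Hz, 24 fps film frames get held for an uneven 3:2 pattern of refresh cycles, while at 48 or 72 Hz every frame is held equally.

[code]
import math

def cadence(content_fps, refresh_hz, n_frames=8):
    """Count how many refresh cycles each content frame stays on screen
    when the display runs at a fixed refresh rate."""
    counts = []
    for i in range(n_frames):
        # Scanouts occur at multiples of the refresh period; frame i
        # covers the interval [i, i+1) in content-frame time.
        start = i * refresh_hz / content_fps
        end = (i + 1) * refresh_hz / content_fps
        counts.append(math.ceil(end) - math.ceil(start))
    return counts

print(cadence(24, 60))  # [3, 2, 3, 2, ...] -> uneven hold times = judder
print(cadence(24, 48))  # [2, 2, 2, 2, ...] -> every frame held equally
print(cadence(24, 72))  # [3, 3, 3, 3, ...] -> also perfectly even
[/code]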
By spanning you mean something like an Eyefinity (or Nvidia's equivalent) setup? I'm not using anything like that. I just use one monitor mainly for games and movies, the second for browser and other stuff.
"That could actually be one heckuva thing for Blu-ray playback. You'd get 48 fps clarity without the crappy soap opera effect."

You can already do that on most PC displays (by creating 24 or 48 Hz custom resolutions and configuring your player to use them).
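In case anyone wants to try it, here's roughly how the switching works on Linux; a hedged sketch only, since the output name and mode below are examples from one setup (check `xrandr` for yours). On Windows the equivalent is a custom resolution in the driver control panel plus your player's refresh-switching option.

[code]
import subprocess

# Assumptions: X11 with xrandr available; the connector name and mode
# are examples only; substitute whatever `xrandr` lists for you.
OUTPUT = "HDMI-1"
MODE = "1920x1080"

def set_refresh(hz):
    """Switch the monitor to the given refresh rate."""
    subprocess.run(
        ["xrandr", "--output", OUTPUT, "--mode", MODE, "--rate", str(hz)],
        check=True,
    )

set_refresh(24)   # or 48 for the doubled-up cadence, before the film
# ... play the movie with the player set not to resample to 60 Hz ...
set_refresh(60)   # restore the normal desktop rate afterwards
[/code]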
G-Sync will always be more expensive than purely software-driven solutions like FreeSync (which builds on Adaptive-Sync's already-existing provisions in the standard). A sync chip will always be more expensive than no sync chip.
"FreeSync isn't software driven even on AMD's hardware. You can't use older GPUs."

I don't see how an older, different architecture not supporting a more modern feature can be construed as "hardware-driven". After all, G-Sync is hardware driven not because Fermi doesn't support it (it doesn't), but because it relies on an nVidia sync chip.
"More importantly, the reason G-Sync took off is that Nvidia took on the responsibility of making the hardware needed."

But that means they'd have to pay nVidia royalties for the sync chips installed in their displays. And somehow I doubt paying nVidia is cheaper in the long run.
"I'm looking forward to seeing how the Carrizo APU for desktop shapes up. I've been in the market for a HTPC with some light gaming."

Oddly enough, AMD are dead-quiet about desktop Carrizo. It isn't known when they'll be brought forth. All information points to mobile being launched first.
Oddly enough, AMD are dead-quiet about desktop Carrizo. It isn't known when they'll be brought forth. All information points to mobile being launched first.
"I saw somewhere the max TDP will be 64W. That's down a lot from 95W. No word on DDR4 support or if it'll have embedded SDRAM."

It's highly unlikely to have DDR4 support. HBM is unknown, but unlikely outside at most a couple experimental designs from the likes of HP.
I don't see how an older, different architecture not supporting a more modern feature can be construed as "hardware-driven". After all, G-Sync is hardware driven not because Fermi doesn't support it (it doesn't), but because it relies on an nVidia sync chip.
Virtually all monitors feature circuitry known as a “scaler,” which governs the interaction between the graphics card and the physical LCD panel. This essential circuitry also provides user-facing features like: audio output capabilities, display interfaces (e.g. DisplayPort™ or HDMI®), and the “OSD” (settings menu). AMD has recently entered collaboration with the industry’s largest scaler vendors (MStar, Novatek and Realtek) to create a range of monitor scalers ready for DisplayPort™ Adaptive-Sync by year end; these scalers will pave the way for retail monitor designs that offer compatibility with AMD’s Project FreeSync in 1Q15.
But that means they'd have to pay nVidia royalties for the sync chips installed in their displays. And somehow I doubt paying nVidia is cheaper in the long run.
Is Nvidia _required_ to support freesync if they want to support Displayport 1.2a or higher? Is that part of the VESA standard for 1.2a a requirement for implementors?
"Is Nvidia _required_ to support freesync if they want to support Displayport 1.2a or higher? Is that part of the VESA standard for 1.2a a requirement for implementors?"

1.2a and 1.3 provide an optional spec for Adaptive-Sync, but it isn't mandatory.
"It's good news that the vendors are on board, but hardware is required on the display side for Adaptive-Sync support."

No, it is not. AdaptiveSync is baked into DP 1.2a and higher. What is required is testing by VESA of a particular display configuration to see whether it passes A-Sync certification or not. The actual standard is royalty-free.
As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.
“VESA is constantly evaluating new methods and technologies that add value to both the end user and our OEM member companies. Adaptive-Sync delivers clearly visible advantages to the user for gaming and live video, and contributes to the development of sleeker mobile system designs by reducing battery power requirements,” said Bill Lempesis, VESA Executive Director. “VESA has developed a test specification to certify Adaptive-Sync compliance. Systems that pass Adaptive-Sync compliance testing will be allowed to feature the official Adaptive-Sync logo on their packaging, informing consumers which DisplayPort-certified displays and video sources offer Adaptive-Sync.”
Implementation of DisplayPort Adaptive-Sync is offered to VESA members without any license fee.
"Hardware is required on the GPU side for FreeSync."

Whereas for G-Sync (proprietary) hardware is required on both GPU and display side.
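To illustrate what that display-side support actually has to do: with adaptive sync the panel waits to refresh until the GPU presents a frame, clamped to the panel's supported range. Here's a toy Python model of that idea (the 40-144 Hz window and the repeat-frame handling are my own simplifications, not any vendor's actual implementation):

[code]
# Toy variable-refresh model: the panel refreshes when a new frame is
# ready, but never faster than MAX_HZ and never slower than MIN_HZ.
MIN_HZ, MAX_HZ = 40.0, 144.0          # illustrative panel range
MIN_INTERVAL = 1000.0 / MAX_HZ        # ms; can't refresh sooner
MAX_INTERVAL = 1000.0 / MIN_HZ        # ms; must refresh by then

def scanout_times(frame_ready_ms):
    """Given times (ms) at which the GPU finishes frames, return when
    the panel refreshes and whether it showed a new or repeated frame."""
    scanouts, last = [], 0.0
    for ready in frame_ready_ms:
        t = max(ready, last + MIN_INTERVAL)   # respect max refresh rate
        while t - last > MAX_INTERVAL:        # GPU too slow: panel must
            last += MAX_INTERVAL              # self-refresh with the
            scanouts.append((last, "repeat")) # previous frame
        scanouts.append((t, "new frame"))
        last = t
    return scanouts

# Smooth ~60 fps pacing with one 57 ms hitch in the middle:
for t, kind in scanout_times([16.7, 33.4, 90.0, 106.7]):
    print(f"{t:6.1f} ms  {kind}")
[/code]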
That's exactly what I'm doing and I've noticed zero problems.
I saw somewhere the max TDP will be 64W. That's down a lot from 95W. No word on DDR4 support or if it'll have embedded SDRAM.
The top end of Kaveri is still too expensive. For me to even care about a Carrizo desktop APU, the entire computer would need to be $200.
The problem is that there are way better machines to build in the ~$400 space, and Intel and MS are being crazy competitive with the Atom Z3785.
Walmart is going to have a $100 Win8.1 tablet (with HDMI out) that uses the Z3785, which really is the best bang for the buck as an HTPC and also doubles AS A TABLET.
"No, it is not. AdaptiveSync is baked into DP 1.2a and higher."

It is, since it's just an optional extension to the standard.
No, it is not. AdaptiveSync is baked into DP 1.2a and higher. What is required is testing by VESA of a particular display configuration to see whether it passes A-Sync certification or not. The actual standard is royalty-free.
From the official A-Sync statement:
Whereas for G-Sync (proprietary) hardware is required on both GPU and display side.
Is AMD ending their FX line of desktop CPUs?
"I'm not talking about royalties or how much is paid to whom. I'm taking issue with your statement that FreeSync is a 'purely software-driven solution' when it relies on Adaptive-Sync, which itself requires a compatible scaler."

No disrespect intended, but that statement makes no logical sense. A display requires a scaler the same way a car requires an engine. Without one, all you have is a panel, a light source and a power brick. A-Sync is already present in the latest standard required from newly-installed VESA-compliant display scalers; all a maker needs is to pass the VESA test for the little sticker on their product.
"It's technically been dead since Piledriver back in 2012. Everything else since then has just been better binned chips. Perhaps when the new x86 Zen architecture drops they'll revive the FX line, but I have a feeling in the end it'll all just be APU's."

AM3+ is deader than a slug in a salt mine. If there will ever be another big CPU from AMD, it will definitely feature a new non-backwards compatible socket. Though I find it unlikely for them to make pure CPUs. After all, Intel doesn't anymore.
It's technically been dead since Piledriver back in 2012. Everything else since then has just been better binned chips. Perhaps when the new x86 Zen architecture drops they'll revive the FX line, but I have a feeling in the end it'll all just be APU's.
AM3+ is deader than a slug in a salt mine. If there will ever be another big CPU from AMD, it will definitely feature a new non-backwards compatible socket. Though I find it unlikely for them to make pure CPUs. After all, Intel doesn't anymore.
Agreed. It doesn't even have PCIe 3.0. My best guess is indeed a new socket, full-on SoC, HSA, etc. Zen will be focused on IPC instead of a CMT-type design.
No disrespect intended, but that statement makes no logical sense. A display requires a scaler the same way a car requires an engine. Without one, all you have is a panel, a light source and a power brick.
"You can already do that on most PC displays (by creating 24 or 48 Hz custom resolutions and configuring your player to use them)."

I wish that would work on my monitor or laptops.
"Is Nvidia _required_ to support freesync if they want to support Displayport 1.2a or higher? Is that part of the VESA standard for 1.2a a requirement for implementors?"

No, it's optional.
"I'll put it another way. My point is the Adaptive-Sync compatible scalers don't exist yet."

The Panasonic WT600 from 2013 had DP 1.2a in addition to HDMI 2.0. I'm pretty certain several new(er) displays featuring DP have already implemented 1.2a. The only ones which definitely haven't are nVidia themselves (their chip offers only DP 1.2, deliberately locking out A-Sync). Of course, that won't help them too much in the future since Adaptive Sync is automatically implemented in DP 1.3 rather than optionally certifiable as in 1.2a.
"They either have to choose to invest in development of their own Adaptive-Sync scalers, or pay NVIDIA for G-Sync modules."

And, since nVidia love high-value-added stuff (and their scaler is reputedly more complex), which approach do you think is cheaper? Especially since DP 1.2a displays are likely already out there (AMD have stated that some existing displays are already compliant) and cost considerably less than nVidia's sync?
"No disrespect intended, but that statement makes no logical sense. A display requires a scaler the same way a car requires an engine. Without one, all you have is a panel, a light source and a power brick. A-Sync is already present in the latest standard required from newly-installed VESA-compliant display scalers; all a maker needs is to pass the VESA test for the little sticker on their product."

Unless something changed recently, adaptive sync is an optional part of the standard.
"The Panasonic WT600 from 2013 had DP 1.2a in addition to HDMI 2.0."

That doesn't mean it has adaptive sync support. I think riflen is correct; there is no display product which supports it on the market.
"That doesn't mean it has adaptive sync support. I think riflen is correct; there is no display product which supports it on the market."

Well, you're half right. At PDXLAN they confirmed one vendor has already shipped AdaptiveSync-capable monitors.
"Of course, that won't help them too much in the future since Adaptive Sync is automatically implemented in DP 1.3 rather than optionally certifiable as in 1.2a."

Sadly, it will help them, because it's optional in 1.3 in the same fashion.
The Panasonic WT600 from 2013 had DP 1.2a in addition to HDMI 2.0. I'm pretty certain several new(er) displays featuring DP have already implemented 1.2a. The only ones which definitely haven't are nVidia themselves (their chip offers only DP 1.2, deliberately locking out A-Sync). Of course, that won't help them too much in the future since Adaptive Sync is automatically implemented in DP 1.3 rather than optionally certifiable as in 1.2a.
And, since nVidia love high-value-added stuff (and their scaler is reputedly more complex), which approach do you think is cheaper? Especially since DP 1.2a displays are likely already out there (AMD have stated that some existing displays are already compliant) and cost considerably less than nVidia's sync?
"Unless something changed recently, adaptive sync is an optional part of the standard."

Were it optional, nVidia wouldn't have specifically installed DP 1.2 instead of 1.2a in their scaler chip in order to lock out A-Sync functionality.
"Is it just me, or is there very little information about Desktop Carrizo and this is just Mobile (laptop) Carrizo?"

There are various rumours on the net, including some which claim that AMD will skip desktop-based Excavator entirely. There are a ton of unknowns.
"Were it optional, nVidia wouldn't have specifically installed DP 1.2 instead of 1.2a in their scaler chip in order to lock out A-Sync functionality."

They didn't specifically install DP 1.2; 1.2a came after, and when they want to, they can have G-Sync with 1.2a too.
A-Sync can probably be brute-forced active on every single 1.2a display; it's just that it is not 'officially' recognised as supported, due to various reasons.
"Were it optional, nVidia wouldn't have specifically installed DP 1.2 instead of 1.2a in their scaler chip in order to lock out A-Sync functionality. A-Sync can probably be brute-forced active on every single 1.2a display; it's just that it is not 'officially' recognised as supported, due to various reasons."

Wow, are you ever full of shit.
http://www.anandtech.com/show/8533/vesa-releases-displayport-13-standard-50-more-bandwidth-new-features said:

"Meanwhile to no surprise (but always good for clarification), DisplayPort Adaptive-Sync remains an optional part of the specification, so Adaptive-Sync availability will continue to be on a monitor-by-monitor basis as a premium feature."
"I don't know why Nvidia is so slow at this point, but it's their choice."

nVidia deliberately locked A-Sync out. It's fully coherent with their regular M.O. Close that garden as much as you can.
"Please. nVidia deliberately locked A-Sync out. It's fully coherent with their regular M.O. Close that garden as much as you can."

I know that Nvidia is locking out A-Sync, but my simple statement is still that A-Sync is optional in 1.2a and 1.3.
http://tech4gamers.com/nvidia-says-no-to-displayport-1-2a-and-vesa-adaptive-sync/
Adaptive Sync is the biggest news in DisplayPort 1.2a, which is otherwise almost identical to the regular 1.2, so there is simply no reason for Nvidia to advertise or indeed even to implement this support. G-Sync is Nvidia's commitment to the future of screen synchronization, which means that Nvidia has no reason to update the interface until the next major revision, which becomes 1.3.
I refuse to build an APU desktop. If the FX line of CPUs doesn't make a comeback, then I'll be investing in an Intel CPU in the future.
God dammit, I hope HDMI adds FreeSync capability to the new specs... I'll be damned if we don't have variable refresh rate on the PS5...
In addition, Capcom announced its collaboration with AMD on the AMD Mantle API to enhance Capcom's Panta-Rhei engine, enabling enhanced gaming performance and visual quality for upcoming Capcom game titles.
"This will improve the performance of our Panta-Rhei engine, which was originally developed for console platforms," said Masaru Ijuin, technical director, Capcom. "Capcom is evaluating AMD's Mantle technology to help improve the graphics pipeline, and integrate it into Panta-Rhei to provide outstanding benefits and impressive performance for gamers as well as the gaming developers."
"It's a bit amusing to see AMD pushing Mantle (for its great low-level performance on their GPUs!) and FreeSync (because it's based on an independent standard!) on the same day. Of course, all for-profit companies are caught in inconsistencies like that at some points."

Nvidia, the leader in proprietary tech, does the same. I don't see you pointing these inconsistencies out for them; what gives?
From the AMD newsroom
"Nvidia, the leader in proprietary tech, does the same. I don't see you pointing these inconsistencies out for them; what gives?"

I'm an nvidia guy and I agree with the sentiments regarding this. Not necessarily piggybacking and hopping on his case, but let's step back here and see what FreeSync is when it hits.