At its Future of Compute event in Singapore today, AMD announced partnerships with several companies. One of the more noteworthy announcements is that Samsung will be making FreeSync enabled displays that should be available in March 2015. The displays consist of 23.6" and 28" versions of the UD590, along with 23.6", 28", and 31.5" variants of the UE850. These are all UHD (4K) displays, and Samsung has stated its intention to support Adaptive-Sync (and thereby FreeSync) on all of its UHD displays in the future.

FreeSync is AMD's alternative to NVIDIA's G-SYNC, with a few key differences. The biggest difference is that AMD proposed an extension to DisplayPort called Adaptive-Sync, which VESA accepted as an amendment to the DisplayPort 1.2a specification. Adaptive-Sync is thus an open standard that FreeSync leverages to enable variable refresh rates. As for system requirements, beyond a display that supports DisplayPort Adaptive-Sync, you need a supported AMD GPU with a DisplayPort connection and an AMD driver with FreeSync support.
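To make the mechanism concrete, here's a minimal sketch of how a variable refresh window behaves: the display refreshes when a new frame arrives, so long as the frame time falls inside the range the panel reports. The panel limits, function name, and numbers below are hypothetical illustrations, not anything AMD or VESA has published.

```python
PANEL_MIN_HZ = 40   # hypothetical slowest refresh the panel supports
PANEL_MAX_HZ = 60   # hypothetical fastest refresh the panel supports

def next_refresh_interval(frame_time_s: float) -> float:
    """Clamp the GPU's frame time to an interval the panel can honor."""
    min_interval = 1.0 / PANEL_MAX_HZ   # fastest allowed refresh, in seconds
    max_interval = 1.0 / PANEL_MIN_HZ   # slowest allowed refresh, in seconds
    return min(max(frame_time_s, min_interval), max_interval)

# A 45 FPS frame (~22.2 ms) sits inside the window and is shown on arrival;
# a 20 FPS frame (50 ms) exceeds it, so the panel must repeat a frame.
print(next_refresh_interval(1 / 45))  # ~0.0222 s
print(next_refresh_interval(1 / 20))  # 0.025 s, clamped to the 40Hz floor
```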

FreeSync is also royalty free, which should help manufacturers control costs on FreeSync capable displays. There are other costs to creating a display that can support Adaptive-Sync, naturally, so we wouldn't expect price parity with existing LCDs in the near term. In its FreeSync FAQ, AMD notes that the manufacturing and validation requirements to support variable refresh rates without visual artifacts are higher than for traditional LCDs, so cost-sensitive markets will likely hold off on adopting the standard for now. Over time, however, if Adaptive-Sync catches on, economies of scale will come into play and we could see widespread adoption.

Being an open standard does have its drawbacks. NVIDIA partnered with display manufacturers to develop G-SYNC and deployed it about a year ago, and there are now a 4K 60Hz G-SYNC display (Acer's XB280HK) and a QHD 144Hz G-SYNC display (ASUS' ROG Swift PG278Q) that have been shipping for several months. In many ways G-SYNC demonstrated the viability of adaptive refresh rates, but regardless of who gets credit the technology is quite exciting. If Adaptive-Sync gains traction, as an open standard there's nothing to stop NVIDIA from supporting it and altering G-SYNC to work with Adaptive-Sync displays, but we'll have to wait and see on that front.

Pricing for the Samsung displays has not been announced, though the existing 28" UD590 tends to sell for around $600. I'd expect the Adaptive-Sync enabled monitors to carry at least a moderate price premium, but we'll see when they become available sometime around March 2015.

Source: AMD

Comments

  • HunterKlynn - Friday, November 21, 2014 - link

    Hush, fanboy.
  • chizow - Saturday, November 22, 2014 - link

    Oh that's clever, would expect nothing less from a fanboy.
  • haukionkannel - Saturday, November 22, 2014 - link

    Intel and all the ARM producers are going to use adaptive sync. They have such weak GPUs in their systems that they need all the help adaptive sync can give. And it doesn't even cost them anything; just new GPU drivers to support the feature!
    But we don't know the quality yet, so next spring will be interesting!
  • chizow - Saturday, November 22, 2014 - link

    Why would Intel and ARM have any interest in supporting this given it is almost strictly going to be used for gaming and may require additional hardware and software on their end to support Adaptive Sync on their GPUs?

    Given AMD has come forward and claimed Nvidia GPUs cannot support Adaptive Sync the same way their own GPUs can, who is to say Intel or ARM have the requisite support in their own display adapters?

    AMD claiming this is an open standard that everyone is free to use is just another half-truth, a ploy/stall tactic to try and hinder the adoption of G-Sync, but in the end the score will still be:

    Nvidia ~65% +/-5% * % of GeForce Kepler or newer graphics cards
    AMD 30% +/-5% * % of GCN 1.1+ GPUs (Hawaii, Bonaire, Tonga)

    Because no one else will care enough about it to implement and support it.
  • DiHydro - Wednesday, December 3, 2014 - link

    "Why would Intel and ARM have any interest in supporting this given it is almost strictly going to be used for gaming and may require additional hardware and software on their end to support Adaptive Sync on their GPUs?"

    Intel and ARM also serve very different markets than NVIDIA and AMD. They can implement adaptive refresh rate technology in phones, laptops, commercial displays, and integrated displays in cars and other areas. I could see this being helpful on a mobile display, because the CPU/GPU can determine that the screen doesn't need to be updated. This frees up CPU/GPU cycles for more performance, or you can idle the CPU/GPU to save power when the screen isn't changing. For embedded solutions, this would allow a larger, higher resolution screen to be driven for a given CPU or GPU. If it is displaying text, the refresh could be ~1 Hz, and if it is video it could bump up to ~24 Hz (see the sketch after this comment).

    Therefore, your opinion that Intel and ARM would have no incentive to use this tech is not correct.
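As a rough illustration of the content-driven refresh idea in the comment above, here is a sketch that picks a refresh rate from the content type and clamps it to the panel's range. All names and numbers are hypothetical.

```python
PANEL_MIN_HZ = 9    # hypothetical lower bound of the panel's range
PANEL_MAX_HZ = 60   # hypothetical upper bound of the panel's range

CONTENT_HZ = {
    "static_text": 1,    # nearly static screen: refresh as rarely as possible
    "video_24p": 24,     # match the film cadence exactly
    "scrolling_ui": 60,  # smooth interaction wants the full rate
}

def target_refresh(content: str) -> int:
    """Return a refresh rate for the content, clamped to the panel range."""
    hz = CONTENT_HZ.get(content, PANEL_MAX_HZ)
    return max(PANEL_MIN_HZ, min(hz, PANEL_MAX_HZ))

print(target_refresh("static_text"))  # 9: clamped up to the panel minimum
print(target_refresh("video_24p"))    # 24: matched exactly
```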
  • Despoiler - Friday, November 21, 2014 - link

    FreeSync has a far wider frequency range than G-Sync: FreeSync covers 9–240 Hz, while G-Sync covers 30–144 Hz.
  • chizow - Friday, November 21, 2014 - link

    Still have no idea if it's true 240Hz input or fake TruMotion a la HDTVs, not to mention anything over 60Hz on the UHD panels these are going into will be useless for about five more years, as there won't be the GPU horsepower to drive such refresh rates.

    My bet is that it's just marketing jargon for fake refresh rates, unlike G-Sync, which gets actual 120+ FPS inputs.
  • MrSpadge - Friday, November 21, 2014 - link

    With such sync technology there's really no need for those super high display refresh rates. And if you're concerned about the latency of your inputs: those shouldn't be coupled to the display frame rate anyway.
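To illustrate the decoupling described above, here is a minimal sketch of a loop that samples input on a fixed high-frequency schedule independent of the render rate. The rates and placeholder functions are assumptions, not any particular engine's API.

```python
import time

INPUT_HZ = 1000   # poll input at 1 kHz, regardless of the display
RENDER_HZ = 60    # render at (up to) the display's refresh rate

def poll_input() -> None:
    pass  # placeholder: read devices and update input state

def render_frame() -> None:
    pass  # placeholder: draw and present one frame

def game_loop(duration_s: float) -> None:
    """Run input polling and rendering on independent schedules."""
    next_input = next_render = time.monotonic()
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        now = time.monotonic()
        if now >= next_input:
            poll_input()
            next_input += 1.0 / INPUT_HZ
        if now >= next_render:
            render_frame()
            next_render += 1.0 / RENDER_HZ

game_loop(0.1)  # run the loop briefly as a demonstration
```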
  • chizow - Thursday, November 20, 2014 - link

    G-Sync makes the display slave to the GPU with a direct link, which is part of the reason there is an expensive FPGA and memory buffer (which acts as a lookaside buffer), so that the monitor only refreshes when the GPU tells it to.

    The way AMD explained their adaptive sync is that the monitor is still reactively predicting how to adjust refresh rates, but there is no direct signaling for each frame. It's hard to say for sure, though, given AMD has changed their tune so many times and never actually demonstrated the tech completely (their live demos used fixed sub-60Hz refresh rates, not adaptive ones as they claimed).
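A minimal sketch of the buffer-and-repeat behavior described in the comment above, assuming a display module that re-scans its last buffered frame whenever the GPU misses the panel's slowest allowed refresh. The callbacks and the 30Hz floor are hypothetical, and this is one reading of a mechanism NVIDIA hasn't fully documented.

```python
MAX_FRAME_INTERVAL_S = 1.0 / 30   # hypothetical 30Hz panel floor

def refresh_loop(wait_for_gpu_frame, scan_out) -> None:
    """wait_for_gpu_frame(timeout_s) returns a new frame or None on timeout;
    scan_out(frame) drives one panel refresh with the given frame."""
    buffered = None
    while True:
        frame = wait_for_gpu_frame(MAX_FRAME_INTERVAL_S)
        if frame is not None:
            buffered = frame      # new frame from the GPU: show it immediately
        if buffered is not None:
            scan_out(buffered)    # on timeout, repeat the buffered frame
```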
  • jimjamjamie - Thursday, November 20, 2014 - link

    23.6" 4K display? Yes please
