Back when the 8800 GT launched, 512MB of video memory was all we needed; 2GB, 3GB or 4GB on a video card seemed like overkill. Fast forward several years and those capacities are now the norm for anyone spending $100+ on a discrete graphics card. So does an 8GB graphics card seem like overkill today? With 4K UHD panels slowly entering the market and multi-monitor setups prevalent among high-end gamers, I would postulate that 8GB of video memory is only the start, especially as memory manufacturing matures and die shrinks continue. To that end, and perhaps to the delight of AMD-based compute users, Sapphire showed Kitguru two R9 290X models at CeBIT this year, both with 8GB of GDDR5.

First up is the Vapor-X model, featuring a 2.5 slot cooler with three fans in a blue shroud. The second is the Toxic Edition, which to the eye looks identical apart from the yellow coloring.   

Video outputs appear to be dual DVI, HDMI and DisplayPort. There is no word on GPU or memory frequencies; however, given the branding, I would expect the Toxic to be the higher-clocked part. For reference, the R9 280X Toxic runs at 1100 MHz core (1150 MHz boost), whereas the R9 280X Vapor-X runs at 950 MHz core (1070 MHz boost). Ryan reviewed the R9 280X Toxic at the end of last year - read his review here. These are also set to be limited edition models; pricing and release dates are as yet unknown, but due to the higher memory cost these cards will carry a price premium. The switch at the rear of both cards is likely for their dual VBIOS functionality - users in a UEFI environment can flip to the UEFI-enabled BIOS for faster booting.

It is a shame that these are going to be limited edition cards, as they would offer a bridge between a high-end consumer compute card with a lot of memory and the full-blown AMD FirePro line. Even so, the number of FirePro cards with more than 8GB is rather limited - an S10000 12GB was announced at the end of last year, and further down the product stack there are a few 6GB models.

Source: Kitguru [1,2]

Comments

  • Mr Perfect - Thursday, March 13, 2014 - link

    Considering the PS4 and XBox 1 both have 8GB of RAM, I was a little surprised that the new video cards were coming out with only 4GB. Granted, it's 8GB of shared RAM in the consoles, but next gen games are going to start taking large pools of RAM for granted now.
  • piroroadkill - Thursday, March 13, 2014 - link

    Nah, Xbox 1 has 64MiB RAM.

    Xbox One has 8GiB unified.
  • ImSpartacus - Thursday, March 13, 2014 - link

    The consoles use that RAM for the entire system, not just the GPU.
  • rish95 - Thursday, March 13, 2014 - link

    Do the PS4 and Xbox One even have the horsepower to make good use of 8GB of RAM?
  • nathanddrews - Friday, March 14, 2014 - link

    It's probably better to have too much and not need it than to not have enough and need it. They did it due to pressure from developers. If developers find a way to use it all, the PS4 will have a distinct advantage.
  • EzioAs - Thursday, March 13, 2014 - link

    So these kids, I'm guessing they used their parents' money as well to buy 3 2560x1600 monitors to actually utilize the 8GB memory that comes with these cards?
  • nathanddrews - Friday, March 14, 2014 - link

    That's not quite fair. You assume it's worthless and bought for spoiled rich kids.

    First, it's been confirmed through testing that you need ~4GB if you plan on doing 4xAA at 4K, depending on the game. If you start playing with higher settings, higher-res texture packs, and newer games designed to take advantage of more VRAM (thanks Xbone/PS4), then 8GB won't seem too extreme.

    Second, you assume that only gamers want this, but there are many non-gaming applications that will eat up every last bit of VRAM you can throw at them and many consumers unwilling to spend the incredible premium on workstation cards.
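As a rough back-of-the-envelope check on where that VRAM goes, here is a minimal sketch of the render-target math at 4K with 4x MSAA. The per-pixel byte counts (RGBA8 color, 32-bit depth) are simplifying assumptions; real engines allocate many more buffers, and drivers add overhead:

```python
# Rough estimate of render-target memory at 4K with 4x MSAA.
# Assumes 4 bytes/pixel color and 4 bytes/pixel depth (simplifications).
WIDTH, HEIGHT = 3840, 2160
BYTES_COLOR = 4
BYTES_DEPTH = 4
MSAA_SAMPLES = 4

pixels = WIDTH * HEIGHT
render_targets = pixels * (BYTES_COLOR + BYTES_DEPTH) * MSAA_SAMPLES
print(f"4K 4xMSAA color+depth: {render_targets / 2**20:.0f} MiB")
# -> roughly 253 MiB
```

The takeaway is that render targets alone are only a few hundred MiB even with MSAA; the multi-gigabyte totals in that testing come mostly from textures, geometry and intermediate buffers, which is why high-res texture packs are what push usage past 4GB.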

  • The Von Matrices - Thursday, March 13, 2014 - link

    Do these cards use 16 4Gb modules or 32 2Gb modules?
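Either layout reaches the same capacity; a quick sketch of the arithmetic behind the question (the two chip densities are just the options raised above, not confirmed board details):

```python
# 8 GB of GDDR5 expressed in gigabits, split across candidate chip densities.
TOTAL_GB = 8                      # card capacity in gigabytes
total_gbit = TOTAL_GB * 8         # = 64 Gb of memory in total

chips_4gbit = total_gbit // 4     # count of 4 Gb (512 MB) chips
chips_2gbit = total_gbit // 2     # count of 2 Gb (256 MB) chips
print(chips_4gbit, chips_2gbit)   # -> 16 32
```

The practical difference is board real estate and routing: 16 higher-density chips fit on one side of the PCB more easily than 32 lower-density ones.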
  • Frenetic Pony - Thursday, March 13, 2014 - link

    Exactly what I've been waiting for, dedicated next gen games should start eating up video ram once they drop last gen as their minimum spec. Future proof gaming here we are!
  • MrSpadge - Friday, March 14, 2014 - link

    Sure... with higher end Maxwells and 20 nm almost around the next corner.
