56 Comments
Flunk - Friday, March 13, 2015 - link
The thicker notebooks are a little underspecced in the GPU department. Generally people buy those huge gaming notebooks to get something like the 970m or 980m. If it's just going to have a 960m/965m I don't see the appeal.
ingwe - Friday, March 13, 2015 - link
Totally agree, though I am not into desktop replacements. If I were going to carry around a huge laptop, I would want it to be top of the line.
eanazag - Friday, March 13, 2015 - link
It was mentioned that they already have that same 17" chassis with a 980M in it, so this is a refresh of a lower tier model.
In any machine it is time to put DVD to rest. If there is an optical drive, I am glad that it is Blu-ray.
That G501 looks like it is going toe-to-toe with the Razer Blade on the lower end. I'd love to see those next to each other.
I'd be interested in either 15.6" model. I have had a 15 before and like the screen size between too big and too small.
dragonsqrrl - Friday, March 13, 2015 - link
I actually prefer to see the 960m in the thin form factor of the G501. Likely far superior battery life under dGPU load, better thermals and noise levels, and a lower price when compared to the 14" Blade. The Blade seems to be really pushing the thermal envelope of its form factor with x70m class GPUs, this seems more practical and balanced. You're not going to be gaming at 4K or 3K anyway, especially not with newer AAA titles, and the native resolution allows for 1/4 1080p scaling, which should be the target gaming resolution on the G501.
The only thing I don't like about it is the x4 SSD... what? Why not just a single SSD with a HDD for storage and pass on the cost savings? Otherwise a great looking notebook.
dragonsqrrl - Friday, March 13, 2015 - link
Edit: PCIe x4 SSD, not x4 SSD's... misread. Still wish it came with an additional HDD for storage, I would think they should have the space for a 2.5" drive in a 15" notebook... but maybe that's why it has a 96 Wh battery.
fokka - Monday, March 16, 2015 - link
no dvd and hdd is definitely the reason they are able to fit such a massive battery inside. a 2.5"-bay would be nice though, you're right.
NatePo717 - Friday, March 13, 2015 - link
The 501 with a 970 at the same price makes a fantastic competitor to the Razer Blade. 960M is just too little for the price. I don't understand pairing really high res screens with GPUs that can't handle them natively. Why use 4 drives? That's just two more points for potential drive failure. 2 x 256 makes more sense to me.
I'm looking forward to reviews of the Gigabyte P35Xv3-CF7.
mmrezaie - Friday, March 13, 2015 - link
mmrezaie - Friday, March 13, 2015 - link
That's PCIe x4, not four drives.
ingwe - Friday, March 13, 2015 - link
I agree that the 960m is too little.
I don't think there are four hard drives. I believe that is PCIe x4, meaning four lanes.
NatePo717 - Monday, March 16, 2015 - link
Oh, my mistake!
jt122333221 - Friday, March 13, 2015 - link
I'm regretting picking up that Y50 a couple of months ago... Should have held out for the G501. Oh well.
Antronman - Monday, March 23, 2015 - link
Ouch. That display on the Y50.
redmonkeyjunkie - Friday, March 13, 2015 - link
Seems like the high-end, well designed laptop is dying out at every manufacturer except for Apple. Why are OEMs neglecting that market so badly?
WereCatf - Friday, March 13, 2015 - link
What's so bad about these? And how, exactly, do Apple's laptops make for better gaming laptops than these?
extide - Saturday, March 14, 2015 - link
Your fanboy is showing. There are plenty of great high end non-Apple laptops.
Notmyusualid - Monday, March 16, 2015 - link
Honestly?
Then please point me to that Apple gaming laptop, with an i7 39xx / 49xx CPU AND with a 100W+ GPU installed.
I eagerly await your response.
Wolfpup - Thursday, March 19, 2015 - link
Yeah, I don't even consider Apple's notebooks high end. They're all low end or mid range, despite their absurd prices.
Their *current* $2500 notebook has less than 1/3 the power of my *2012* Alienware that I bought for $1700! Plus my system has user replaceable drives, RAM, way more ports, WAAAAAAAAAAY better cooling, etc.
Heck, while my current notebook is powerful enough, I still drool over this year's gear, but $2500 with Apple doesn't buy you 2012's gear!
WereCatf - Friday, March 13, 2015 - link
I've been eyeing buying a G751JY, but I don't really see the point in including a damn VGA port in a high-end gaming laptop. And an optical drive bay is also kind of redundant; that space could be much better used for, say, additional cooling. (Yes, I know you can remove the drive, but that doesn't remove the bay itself.) Besides, with HEVC slowly gaining traction I wonder if it wouldn't be better to wait for the next NVIDIA GPU to come out, which would hopefully have full H/W-accelerated decoding and encoding of it.
Nightwolf1 - Friday, March 13, 2015 - link
Where is the wired Ethernet?
Why not just a USB 3.1 Type-C and 2 USB 3.0 ports? They always come dragging behind Apple!
Is there full HDMI 2.0 support?
SirKnobsworth - Friday, March 13, 2015 - link
The G501 is too thin for a full-sized ethernet port. Not sure why they didn't include it on the thicker models.
hanssonrickard - Sunday, March 15, 2015 - link
Just buy an Ethernet adapter for the Thunderbolt port, the way Apple handles it in their MacBook (Pro/Air). Works fine.
Notmyusualid - Monday, March 16, 2015 - link
+1.
I can confirm this is correct with USB > Ethernet adapter also.
Notmyusualid - Monday, March 16, 2015 - link
FORTUNATELY for Apple, people like you exist... (and I do respect them for creating such a fan base as they have. Brilliant marketing)
I've also noticed a recent trend of denying small chassis laptops a physical ethernet connector - and I'm not saying I like it, but after using one for some months, it is not a problem at all. A USB 3.0 > Gig Ethernet adapter still has amazing latency, which I definitely expected it would not have. Nice to be proven wrong sometimes.
Anyway, don't you have to be getting along, to go collect your $17k smartwatch from somewhere?
Wolfpup - Thursday, March 19, 2015 - link
Are you saying having to use a USB Ethernet adapter increases latency?
At any rate, I use Ethernet 100% of the time, and would be annoyed if a notebook didn't have it. Heck, I've already got 5 USB ports + Ethernet, and STILL use a 7 port USB hub... even just the issue of having somewhere to plug it in would be bad...
Earlmid - Tuesday, March 24, 2015 - link
apple still doesn't have usb type-c on their macbook pros, which is basically what this laptop is competing against.
Hrel - Friday, March 13, 2015 - link
G751 should have a GTX 970M in it, or be thin through the whole chassis with the GPU it does have. Glad there's at least one with a 1080p screen. Considering modern mobile GPUs only very recently started becoming able to run games at 1080p, I'm really shocked that so many are coming out with above-1080p panels. I'm just over here like "modern GPUs struggle to run games at 1080p even 2 years after release, who the fuck thought it was a good idea to introduce resolutions even higher?!?!"
dragonsqrrl - Friday, March 13, 2015 - link
It already comes with 970/980m options.
"I'm just over here like "modern GPU's struggle to run games at 1080p even 2 years after release, who the fuck thought it was a good idea to introduce resolutions even higher?!?!""
You know, you can run games at lower than native resolution. 4K allows for 1/4 scaling to 1080p. And you can still benefit from the higher resolution for everything else.
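To put numbers on the "1/4 scaling" point: a 1080p image maps onto a UHD panel with a whole number of physical pixels per rendered pixel, which is why it can look as clean as a native 1080p screen. A quick sketch (assuming a standard 3840x2160 panel, which the article doesn't spell out):

```python
# Assumed panel: standard 3840x2160 UHD. 1080p is then an exact integer fit.
uhd_w, uhd_h = 3840, 2160
fhd_w, fhd_h = 1920, 1080

# Each axis divides evenly, so there is no fractional scaling blur.
assert uhd_w % fhd_w == 0 and uhd_h % fhd_h == 0

scale = uhd_w // fhd_w                  # 2x per axis
pixels_per_1080p_pixel = scale * scale  # each rendered pixel -> a 2x2 block
print(scale, pixels_per_1080p_pixel)    # 2 4
```

The same integer-fit check fails for in-between render resolutions like 1440p on a UHD panel, which is where non-native scaling gets soft.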
Hrel - Saturday, March 14, 2015 - link
Of course you can lower it, but:
1. It doesn't look so good.
2. I don't want to pay for something I'm not gonna use.
What benefit does the higher resolution provide for "everything else"? Absolutely nothing AFAIC. Makes shit so small you can't read it, fucks up the OS because the DPI is too high. Fucks up every program on the computer because the DPI is too high.
Have you even actually used a 4k screen?
dragonsqrrl - Saturday, March 14, 2015 - link
"It doesn't look so good"It shouldn't look any worse than a native 1080p display. That was the point of my previous comment.
"I don't want to pay for something I'm not gonna use."
Then you shouldn't get one, especially if you're that certain you won't benefit from it. Although I'm sure you're familiar with the general trend towards higher DPI displays. You're going to have less and less of a choice going forward, particularly with notebooks.
"Have you even actually used a 4k screen?"
Yes, although admittedly I haven't used a Windows machine with quite that high of a DPI. I've used a 24" 4K monitor and I've tried a Surface Pro a few times. I'm well aware of UI scaling issues throughout Windows 8, and web browsers, and content creation applications, and a lot of other things. I totally agree that it's not ready for prime time. But when scaling works, and when viewing native content at 4K, it looks amazing. I think it can offer a lot of additional fidelity over standard DPI displays, particularly for content consumption on desktop.
Notmyusualid - Monday, March 16, 2015 - link
Yep - QHD 28xx x 1400 is what I want in a gaming machine. Please nothing higher. The poor GPUs...
4k on a laptop is very much a mixed viewing bag, I can attest. But yes, as the poster below states, when it works, it's lovely.
Wolfpup - Thursday, March 19, 2015 - link
In THEORY it seems like if it's 4x 1080p that should be okay (although 1080p is good on a 24" monitor, too high on a 17" one IMO).
But yeah, I don't see the benefit. On a tablet there is one, but only because I'm holding it right up to my face. On a monitor, I'm at least a few feet away, and I think the quality of the panel would make a lot bigger difference than quadrupling the resolution... And stuff's already small enough...
MojaMonkey - Friday, March 13, 2015 - link
I assume it's a Thunderbolt 1 port. But it seems weird, since Thunderbolt 2 has been out since 2013.
dragonsqrrl - Sunday, March 15, 2015 - link
Then why would you assume it's a Thunderbolt 1 port?arnavvdesai - Friday, March 13, 2015 - link
I like the 501 a lot, I believe the price is right for the thinness. I am a developer & do not really care about the fastest GPU, but do care about a faster SSD & good RAM (16 is plentiful). However, like another commenter mentioned, I would have preferred a slot for a 2.5" SSD in addition to the PCIe x4 SSD.
SirKnobsworth - Saturday, March 14, 2015 - link
Can anyone confirm that these actually have Thunderbolt? I'm seeing it mentioned in multiple articles but nowhere on Asus's press releases and the logo next to the port is DisplayPort, not Thunderbolt.
Brett Howse - Sunday, March 15, 2015 - link
I'll ask ASUS; I see the same thing in the images.
Brett Howse - Monday, March 16, 2015 - link
The US market will have Thunderbolt; it will be optional in other markets.
SirKnobsworth - Monday, March 16, 2015 - link
Thanks for the clarification.
darkich - Saturday, March 14, 2015 - link
That G501 sounds like about the best laptop money can buy.
With a touch screen, Wacom digitizer (and maybe, but not necessarily, a Yoga-like flip), we'd have a perfect machine.
jabber - Saturday, March 14, 2015 - link
Great to see AMD chips and GP.....oh!
As expected really.
bloc - Saturday, March 14, 2015 - link
Hey Asus,
Get rid of the number pad. What gamer is playing a game called Excel?
If I'm paying $2000 for a laptop, I better be sitting dead center of the screen and not 15 degrees off to the left. The G and H should be located in the middle of the keyboard.
Know who finally realized this? The new Alienware 15". They got rid of the numpad.
Notmyusualid - Monday, March 16, 2015 - link
Yep, drop number pad for me too. Sitting off-center is not great...
Wolfpup - Thursday, March 19, 2015 - link
I want the number pad. I use them a lot, and find it annoying when a keyboard doesn't have one.
Of course most of the time I'm using an external (mechanical) keyboard + monitor anyway, but still.
Antronman - Monday, March 23, 2015 - link
And how am I supposed to play Gmod without a numpad?
Tams80 - Sunday, March 15, 2015 - link
It's good to see a laptop with a decent battery capacity, and a decent GPU. I use my laptop both for travelling using the iGPU, and for gaming when plugged into a wall socket. Having a large battery capacity also comes in helpful when you want to game but don't have access to a wall socket (3 or so hours is better than 1).
My Ativ Book 8 will do for now, but if AMD release something to compete with Nvidia, then I would be interested to see if ASUS offer an AMD version.
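A quick back-of-envelope on that runtime: divide the G501's 96 Wh battery by a sustained system draw to get hours. The draw figures below are purely illustrative guesses, not measurements:

```python
# Runtime estimate: battery capacity (Wh) / sustained draw (W) = hours.
# Draw figures are hypothetical, just to show the shape of the tradeoff.
battery_wh = 96.0
for draw_w in (30, 45, 60):
    hours = battery_wh / draw_w
    print(f"{draw_w:>2} W draw -> {hours:.1f} h")
# At ~30 W average the math lands right around "3 or so hours";
# heavy gaming loads push it down toward 1-2 hours.
```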
jabber - Sunday, March 15, 2015 - link
I think you'll be waiting a long time unfortunately. AMD are slowly getting dropped or frozen out of most markets.
dragonsqrrl - Sunday, March 15, 2015 - link
AMD just doesn't have competitive mobile dGPUs, for the simple reason that their dGPUs in general aren't competitive in terms of performance per watt, and I doubt this will change much with the upcoming 300 series.
Notmyusualid - Monday, March 16, 2015 - link
I think it is because we've seen only one new architecture from AMD in their high end mobile for years (early 2012?), and I don't see a Windows system with it.
As far as I can see, it goes something like this:
Desktop 7870 (Pitcairn?), lower clocks, becomes 7970M.
7970M, clock boost added for some temporary higher clocks, becomes 8970M.
M290X is just a rebrand of the HD 8970M with identical performance. (God I hate that rebranding)
The M295X was 'Tonga', but have you seen it anywhere, for sale? (outside of macs) I haven't. And the performance is below GTX680M as far as I can find.
Too busy with their consoles me thinks.
dragonsqrrl - Monday, March 16, 2015 - link
Actually the M295X is quite powerful. It's not quite as fast as a 970M, but it gets within 10% on average. But ya, it's iMac only at the moment, and honestly I don't see much of a market for it outside high-end AIO's. The problem is the TDP needed to achieve that performance. At ~125W I don't think AMD is even targeting notebooks with this. I'm not sure it's a practical option for even high-end desktop replacements, especially when the 970M has ~80W TDP and performs better.
Really AMD's highest-end notebook dGPU is still the M290X (Pitcairn), and that's completely noncompetitive at this point. The 780M outperforms it...
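For what it's worth, plugging this thread's ballpark figures (M295X at ~125 W and within ~10% of a 970M; 970M at ~80 W) into a quick performance-per-watt estimate shows how far apart they are. These are the thread's rough numbers, not official specs:

```python
# Rough perf/W comparison using the thread's ballpark figures
# (illustrative estimates, not official specifications).
m295x_perf, m295x_tdp = 0.90, 125.0     # ~90% of 970M performance, ~125 W
gtx970m_perf, gtx970m_tdp = 1.00, 80.0  # baseline performance, ~80 W

m295x_ppw = m295x_perf / m295x_tdp
gtx970m_ppw = gtx970m_perf / gtx970m_tdp

advantage = gtx970m_ppw / m295x_ppw
print(f"970M perf/W advantage: ~{advantage:.2f}x")  # ~1.74x
```

Under those assumptions the 970M delivers roughly 1.7x the performance per watt, which is the gap the comment is pointing at.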
Wolfpup - Thursday, March 19, 2015 - link
I think the reality is AMD can't get their act together with drivers, and inexplicably that's FIIIINALLY catching up to them. I swear, 10 years ago the situation was probably even worse, yet people would claim you were a fanboy for pointing it out.
I'm mad that we still have to even talk about drivers. Optimus shouldn't exist. It works better than I'd expect, but still causes issues and just shouldn't be there. GPUs on CPUs are still a terrible idea: wasted die space. ALL PCs should get video driver updates from AMD or Nvidia directly... you shouldn't have to check before you buy a system whether it can. (Dell screwed up the Alpha mini-desktop by not letting Nvidia support it! On a GAMING system?!?)
I should not have to fight to find a system without Optimus and fight to find a system that runs normal drivers in 2015... Why do companies get all of this so very wrong? Why do companies selling systems explicitly marketed for GAMING often even manage to screw it up?
TheinsanegamerN - Tuesday, March 24, 2015 - link
iGPUs and Nvidia's Optimus are the only reason gaming laptops have useful battery life, and given how a quad core i7 is not only cheaper, but just as fast as a six core i7 in games, the whole wasted die space thing is a dead argument.
I agree with you on the driver thing, but that is the OEMs' fault, not Nvidia's. Thus the reason I won't buy an Alienware, as tempting as the 13 and 15 are.
fokka - Monday, March 16, 2015 - link
i'm with the other commenters, the g501 should at least come with an optional gtx 965 or 970. that said, i don't see how a 4k screen makes sense if you can't drive games at native resolution on it at all. otherwise it seems to be a nice machine. if only they got rid of that fugly logo on the lid.
bramcasey - Thursday, March 26, 2015 - link
Hey guys!
I bought the GL551JW without the SSD. Just the 1 TB HDD.
Will I be able to add an SSD bought separately? I was hoping to buy an SSD with more space for cheaper. I'm just worried that if you don't buy it with the SSD, they don't leave you a slot on the laptop.
Thanks!
skobbick - Wednesday, April 1, 2015 - link
Anyone know if the G501 will be available with a Full HD screen too, instead of the Ultra HD?
cajun_azn - Thursday, April 2, 2015 - link
I am very curious about the comparison between the GTX 960M 4 GB (on the Asus ROG G501JW) vs. the GTX 970M 3 GB (Razer Blade 14 2015). On paper and all else equal, the G501JW beats the Razer Blade 14 on many features: higher resolution + matte screen, more connectivity, better keyboard layout (need the right-click button for Excel spreadsheets). Really looking forward to this review's testing of the quality of the screen, especially contrast ratio and color accuracy.
I need a high performance laptop for 3D modelling. Which among the following is best: Dell XPS 15 9530, Dell Precision M6800, or Asus G751?
Regards, KPS