27 Comments
Fiercé - Sunday, February 17, 2013 - link
Just curious if the new BlackBerry phone is one of the items you're reviewing and are under embargo for? Or will Mr. Klug only tackle that when it's available in the U.S.?
Won't consider purchasing it until I read a review here.
Brian Klug - Sunday, February 17, 2013 - link
We've got a Z10, well, Anand does, but it might get handed off to me at some point; as usual, battery life testing and such takes a while. It's in the pipeline for sure though :)
-Brian
tipoo - Monday, February 18, 2013 - link
Does anything under embargo start with PS and end with 4? :P
Hybridtechz - Sunday, February 17, 2013 - link
Hi guys, you are awesome. Incredibly focused tech reviews and observations. I would love to hear from you about a method to calibrate the display on devices. I own a Nexus 7 and a Nexus 4 and the screens are not calibrated at all; the panel on the Nexus 4 in particular is a high-end one, but without calibration the colors are washed out. It's frustrating. Can you please talk about this topic in your podcast? Or even better, can you write an article regarding this issue? I think a lot of people would appreciate that, and it would be a good way to make people realize how important this is. You are the only reviewers who highlight this topic when analysing the display on any kind of device.
Brian Klug - Sunday, February 17, 2013 - link
This is a big deal that I spent a lot of 2011 dealing with. The problem is that, unlike Windows or OS X, there's no easy way to load a LUT onto the device in a standard, OS-unified fashion. There are some things you can do per platform (SoC / OEM) to make this work; for example, Francois has been tinkering with display calibration on Android with some apps that require root: https://play.google.com/store/apps/developer?id=su... and there are a few other similar methods, but it depends on the platform and generally is a hacky, iterative process (tweak a file, measure, see what changed, iterate).

So it's possible; it's nowhere near mature, however. I think the ultimate goal is a display colorimeter attached over USB-OTG that will enable end users to do this kind of calibration, since this is a real issue, and OEMs don't want to incur the dollar or so extra on BOM to get it calibrated.
-Brian
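The "tweak a file, measure, iterate" loop Brian describes can be sketched roughly as follows. This is a minimal sketch only: read_colorimeter() and the gamma node path are hypothetical placeholders, not a real device API or an AnandTech workflow.

```python
# Rough sketch of an iterative panel gamma calibration loop (Python).
# Assumptions, not real APIs: read_colorimeter() returns measured luminance
# (0..1) for a displayed grey patch, and GAMMA_NODE is whatever per-panel
# tuning file the platform happens to expose (root required on Android).
import math

GAMMA_NODE = "/sys/class/graphics/fb0/gamma"   # hypothetical tuning node
TARGET_GAMMA = 2.2
STEP = 0.02
TOLERANCE = 0.02

def read_colorimeter(grey_level):
    """Placeholder: return measured luminance for a displayed grey level."""
    raise NotImplementedError("wire this up to an actual colorimeter")

def write_gamma(value):
    """Placeholder: push a new gamma setting to the panel's tuning file."""
    with open(GAMMA_NODE, "w") as f:
        f.write(str(value))

def measured_gamma():
    """Estimate the panel's effective gamma from a 50% grey patch."""
    return math.log(read_colorimeter(0.5)) / math.log(0.5)

def calibrate(initial=2.2, max_iters=50):
    setting = initial
    for _ in range(max_iters):
        write_gamma(setting)                      # tweak
        error = measured_gamma() - TARGET_GAMMA   # measure
        if abs(error) < TOLERANCE:                # close enough to 2.2?
            break
        setting -= STEP if error > 0 else -STEP   # iterate
    return setting
```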
Hybridtechz - Monday, February 18, 2013 - link
Thanks Brian... I beg you to notify readers when something like that happens. For now, I'll take a look at the app you linked.
variety - Thursday, February 21, 2013 - link
For the Nexus 4 the colour controls have been completely hacked open. You can change every possible setting to reach a proper 2.2 gamma image. An app could do this on stock before, but on 4.2.2 Google closed off some settings (for whatever stupid reason). The kernel and mod makers have released versions that allow changing of gamma etc. The best current solution is using the faux123 kernel and his settings app. You can find the kernel here: http://forum.xda-developers.com/showthread.php?t=2...

Here are his two apps to accompany his kernel, one for changing the screen settings and one for everything else (undervolting, overclocking, how many cores, etc.):
https://play.google.com/store/apps/developer?id=Pa...
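As a rough illustration of the mechanism such kernels use: the tuning values are typically exposed as root-writable sysfs nodes that the companion app writes to. The node paths and value strings below are purely hypothetical stand-ins, not the actual faux123 interface; check the kernel thread for the real one.

```python
# Illustration only: pushing colour/gamma tuning values to sysfs nodes exposed
# by a custom Android kernel. Paths and values are hypothetical; the real node
# names and formats are kernel-specific. Requires root.
HYPOTHETICAL_NODES = {
    "/sys/devices/platform/panel/kgamma_r": "0",
    "/sys/devices/platform/panel/kgamma_g": "0",
    "/sys/devices/platform/panel/kgamma_b": "0",
}

def apply_panel_tuning(nodes=HYPOTHETICAL_NODES):
    for path, value in nodes.items():
        with open(path, "w") as f:   # fails on stock kernels / without root
            f.write(value)
```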
Kevin G - Sunday, February 17, 2013 - link
I'm certain that there is plenty in the pipeline, but there is plenty to discuss now: AMD's GPU roadmap snafu, Intel killing off Itanium (and Poulson hasn't gotten any coverage around here at all), rumors of consoles from MS and Sony blocking used games, etc.

While not on the podcast, I'm also kinda surprised that you didn't realize that Ivy Bridge Core i3's only support PCIe 2.0 officially. Much like Turbo and VT-d, Intel is using PCIe bandwidth as a distinguishing feature. Similarly, not all socket 1155 Ivy Bridge chips support 8x + 4x + 4x lane configurations for connecting three devices to the socket.
Kevin G - Sunday, February 17, 2013 - link
Oh wow. Drive bandwidth usage of consumers? That is the absolute last thing major ISPs in the US want. ISPs are currently happy as government-sanctioned monopolies in various markets where they can continually charge more money for less service. They want to suppress bandwidth usage so that they don't have to upgrade their infrastructure. Movement in this area only comes out of necessity, as they first have to convince their shareholders that they actually have to do something to make money instead of exploiting their existing customers.

Having said that, I live in a suburb of KC and am patiently waiting for Google to roll out fiber in my area so I can ditch my cable company. My spite towards my ISP is at the level where I'd be willing to pay out of my own pocket for several of my neighbors to make the jump as well.
As for what Intel could get out of making media deals: possibly a move to provide content for x86 smartphones. If Intel can't get their handsets accepted by carriers, why not go the virtual carrier route and carry along the media they already have licensed? Get an Intel phone, sign up for their phone service, and get their IPTV service as a bonus. Apple faced such an issue in 2006 before deciding to go with AT&T as a carrier.
IanCutress - Monday, February 18, 2013 - link
I only noticed my i3-3225 was PCIe 2.0 only fairly recently (I queried Anand over Twitter), but it still does 8x+4x+4x on the PCIe device side, even with 3x GPUs. Which i3s do not support 8x+4x+4x, may I ask?

It stands to reason that Intel are partitioning more from low to high end. If you want a single high-end feature, you have to buy from the top range of SKUs. If you just want it to go, a low-end part is good enough. Basic SIs selling to the generic home user won't care as much about PCIe 3.0 or VT-d, but the home builder or niche SI delivering to a market segment would.
Ian
PS. I'm green with envy on your proximity to Google Fiber. BT Infinity 3, while the top BT package in London, doesn't even come close. Tweet us your speedtest.net results :)
Kevin G - Monday, February 18, 2013 - link
It looks like I'll have to correct myself a bit. I was browsing ARK and the mobile i3's don't support the 8x+4x+4x configuration. The desktop models do.

I'm hoping for Google Fiber by the end of the year, so it'll be a bit of a wait. The impressive thing is that various speed tests are having issues saturating the connection. I also offered Cyrus over at Ars Technica a hardware loan when he visited Homes4Hackers a few months ago. Turns out that you can't run a local server according to Google's ToS. Regardless, one of my stress tests will be seeing how many people I can host on a Minecraft server. :)
tynopik - Sunday, February 17, 2013 - link
wrong
that is the core of the objection, there's just no way they can get enough content to make it compelling
simply not possible
right
also,
Please, please, please get a Tesla to test, it would be awesome!
mayankleoboy1 - Sunday, February 17, 2013 - link
+1 for getting a Tesla based on GK110 and having a Tahiti vs. GK110 compute shootout. I am sure Asus would be more than willing to lend a card or two for publicity... :D
IanCutress - Monday, February 18, 2013 - link
If you have suggestions for compute benchmarks, please email us and let us know :)
Ian
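As one concrete suggestion, a vendor-neutral OpenCL micro-benchmark runs on both Tahiti and GK110-class cards. A minimal sketch with pyopencl is below; the SAXPY kernel and problem size are arbitrary illustrative choices, not an AnandTech benchmark.

```python
# Minimal OpenCL micro-benchmark sketch (pyopencl): times a SAXPY kernel with
# GPU event profiling and reports effective memory bandwidth. Works on any
# OpenCL device (AMD Tahiti, NVIDIA GK110, ...). Sizes are arbitrary.
import numpy as np
import pyopencl as cl

N = 1 << 24  # 16M floats

KERNEL = """
__kernel void saxpy(const float a, __global const float *x, __global float *y) {
    int i = get_global_id(0);
    y[i] = a * x[i] + y[i];
}
"""

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx, properties=cl.command_queue_properties.PROFILING_ENABLE)
prg = cl.Program(ctx, KERNEL).build()

x = np.random.rand(N).astype(np.float32)
y = np.random.rand(N).astype(np.float32)
mf = cl.mem_flags
x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
y_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y)

evt = prg.saxpy(queue, (N,), None, np.float32(2.0), x_buf, y_buf)
evt.wait()
seconds = (evt.profile.end - evt.profile.start) * 1e-9  # event times are in ns
gbytes = 3 * N * 4 / 1e9  # read x, read y, write y
print("SAXPY effective bandwidth: %.1f GB/s" % (gbytes / seconds))
```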
silenceisgolden - Monday, February 18, 2013 - link
Either in a reply or sometime in a future podcast, can you all offer your opinion on Ivy Bridge-E vs. Haswell? I am looking to upgrade to what would be considered a workhorse build, since I'm running an i7-930 currently. I'm running Adobe products constantly, and when I look on Newegg now, the i7-3820 is $10 more than the i7-3770. Currently it would seem like you would get much more value with the 3820, and since I'm going to continue to need a video card for the foreseeable future, I don't see a reason to get on-chip graphics. It's a workstation basically, so I'm not really concerned with power. Is there any reason to go Haswell over Ivy Bridge-E except that Haswell might be launching sooner?
tynopik - Monday, February 18, 2013 - link
the answer to this is simple
there are very few situations where IVY-E makes sense
1) need more than 4 cores
2) need more than 32GB
3) really really rare cases that need more PCI-E bandwidth
that's it
noblemo - Monday, February 18, 2013 - link
I would also like to hear feedback from AnandTech. Presumably the primary tradeoff is better memory bandwidth in the 3820 (up to 51.2 GB/s via quad-channel memory) versus Haswell's processing improvements.
tynopik - Monday, February 18, 2013 - link
no
the difference in memory bandwidth is unnoticeable in the real world
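For reference on the numbers being debated here, the theoretical gap is straightforward to compute; a quick sketch assuming DDR3-1600 on both platforms (which is where the 51.2 GB/s figure above comes from):

```python
# Peak theoretical DDR3 bandwidth: transfers/s x 8 bytes per transfer per channel.
# Assumes DDR3-1600 on both platforms; real-world usable bandwidth is lower.
def peak_bandwidth_gb_s(mt_per_s, channels, bytes_per_transfer=8):
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

print(peak_bandwidth_gb_s(1600, 2))  # dual-channel (LGA 1155 / Haswell): 25.6 GB/s
print(peak_bandwidth_gb_s(1600, 4))  # quad-channel (LGA 2011, i7-3820):  51.2 GB/s
```

Whether that theoretical headroom shows up in practice depends on the workload.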
JarredWalton - Monday, February 18, 2013 - link
That's not at all true. There can be a noticeable difference in performance thanks to the improved memory bandwidth, but it's usually limited to specific applications that actually hit memory more than most. Server and workstation applications can sometimes benefit, but most consumer applications are fine with dual-channel. I believe Ian even has some stuff that he uses for his other job (scientific research of some form I think) that benefits from quad-channel.
silenceisgolden - Monday, February 18, 2013 - link
I was looking at a few graphs on overclock.com that compared the extreme LGA 2011 versions against the 3770K (stock speed) and showed a noticeable decrease in video encode time. I've had to encode 30-minute videos before and probably will in the future, so I'm very interested in the memory bandwidth argument. If the answer is only that 6 cores are better than 4 cores for that workload, then OK, I'll have to see if a thousand-dollar processor is worth the added cost (which it might be). But if the added cache and what I believe is better throughput make a difference, then I'd like to know if there's a better value to be had from a product only $10 more than the 3770.
Kevin G - Monday, February 18, 2013 - link
I got my six-core Core i7-3930K for $400 from MicroCenter when they were running a sale. If the applications scale from 4 to 6 cores, then the price/performance ratio swings in favor of the 3930K at the price I paid.

Something not mentioned is that the 3770K has Quick Sync, and using an accelerated encoder it could outrun the 3930K even with its extra cores. It is all about picking the best tool for the job, and it may not be that clear.
noblemo - Tuesday, February 19, 2013 - link
Video production is an interesting example because it requires a balanced system where processor, memory, storage, and GPU work together. It is not uncommon to see bottlenecks shift from one component to the other depending on application load. I concede that many hardware configurations have excess memory capacity, but I am not sure how much excess bandwidth is available. Please provide a link with more information regarding practical benefits/constraints of memory bandwidth, especially as it applies to running Adobe Creative Suite products.
ImSpartacus - Monday, February 18, 2013 - link
I doubt AnandTech will be able to confidently verify the TIM until they have a retail unit in their labs. We'll likely hear from enthusiasts that delid their CPUs long before that.
IanCutress - Monday, February 18, 2013 - link
We have labs?!? :D
Krysto - Monday, February 18, 2013 - link
You were really trying to oversell Intel TV to us there, Anand. Good thing Brian was there to push some reality into the discussion.

I don't know why the media is even writing so much about Intel TV. Just because it's Intel and they said they want to enter the market? And...?
I haven't heard anything interesting from Intel that made me think "game-changer" in the TV space. If anything, I expect Intel to be LESS disruptive to the cable companies, not more, because they really have nothing to offer, unlike Google and Apple, who at least have other content. So my guess is they will get desperate and accept whatever deal the cable companies want.
Also, I don't see people buying their $300+ kit, which is probably what the set-top box and the camera are going to cost.
cj100570 - Monday, March 4, 2013 - link
Anand, here ya go buddy:
Car and Driver https://www.youtube.com/watch?v=1kCG-WqpVnI
Motor Trend https://www.youtube.com/watch?v=0DDwwpt-DU4
cj100570 - Monday, March 4, 2013 - link
Oops, here's the Motor Trend review https://www.youtube.com/watch?v=VUg-BZAR5ro