So I have a confession to make—I’m late on this one. Way late. I managed to catch strep throat, came down with a high fever, then a sinus infection, and as a result missed my goal of having everything Verizon 4G LTE wrapped up and published a few weekends ago. One thing led to another, and I promised a number of readers, both in emails and on Twitter, that it would be done long before it actually came to fruition. I think I’m going to add a week to all of my time estimates from now on, just to be safe. Apologies if I made you refresh obsessively a few times there.

That said, it isn’t entirely a loss. Over the past month, I’ve somehow found myself slowly getting buried in literally every single Verizon 4G LTE device (with the exception of the LG VL600 data card), and that’s a good position to be in.

The story of our LTE testing actually started before MWC with the Pantech UML290, and since then, each time a new device has shown up, I’ve hopped in my car, driven two hours to Phoenix (the nearest LTE market), and spent a sleepless 48 hours testing battery life, speeds, and stability. It’s been a lot of testing, driving, and collecting data. I’ve recorded 542 speed test runs on 4G LTE as a result, and many more on EVDO for comparison. There’s a ton of stuff to go over, so to keep things manageable, I’ve split the review down the middle. This half is everything about Verizon 4G LTE from a cellular perspective, including two data cards and a WiFi hotspot. The other half is just the HTC Thunderbolt.

Introduction to Cellular Network Evolution

Before you dive into our review of the Pantech UML290, Novatel Wireless USB551L, and Samsung SCH-LC11, it’s worth having a discussion about what “4G,” and LTE in particular, really is. To that end, I think it’s also worth taking a look back at the evolution of wireless network technology from a historical perspective. It may seem odd to start a story out this way, but it really does give perspective on how far the mobile network story has come since its inception. Crack open any mobile broadband book, and you’ll read a narrative something like this one.

In the 1G days, all we cared about was enabling some very basic things we take completely for granted now: basic voice telephony (analog), mobile billing, roaming from carrier to carrier. Simple problems like multiplexing and managing handover were tackled, and capacity wasn’t a huge concern because of relatively limited adoption due to price. Then came 2G in the form of CDMAone/IS-95 and GSM, which brought more voice capacity (digital), and very basic data connectivity. As adoption increased, more and more capacity was necessary, prompting 3G planning.

Each of the two camps then formed their own 3G projects for improved data speeds (3GPP for GSM-family technologies, 3GPP2 for CDMA-family technologies), the results of which were WCDMA and CDMA2000, respectively. The W in WCDMA stands for “wideband,” since 3GPP settled on relatively wide 5MHz channels compared to CDMA2000’s 1.25MHz channels. The original suite of “3G” technologies didn’t meet the ITU-R goals set out in IMT-2000, which put the 3G throughput target at 2Mbps for stationary or low-mobility use, and both the 3GPP and 3GPP2 camps went back to the drawing board with a new focus on data.

3GPP2’s solution was HRPD (High Rate Packet Data), which we now know as EVDO (EVolution Data Optimized), and 3GPP came up with HSPA (High Speed Packet Access). The big differentiator between the two historically has been that HSPA offered simultaneous voice and data multiplexed on the same 5MHz carrier, while the 3GPP2 solution required a separate 1.25MHz CDMA2000 1x carrier for voice. 3GPP2 went on to address the lack of simultaneous voice and data with EVDV (which combined voice and data on one carrier) and later SVDO (which runs 1x voice and EVDO data over separate radio paths), but neither has seen meaningful adoption. The 3GPP camp also improved on GPRS data rates with EDGE. In modern cellular data networks, HSPA and HRPD (EVDO) have become the dominant 3G players we’re used to seeing.

That’s a hugely oversimplified look at the evolution of the two most popular families of cellular access technologies, but a fairly obvious trend emerges. Focus has gradually shifted away from delivering more and more voice capacity and settled on delivering faster and faster data. Voice’s place in the broader picture is just another service atop a data connection, or something left on a legacy network technology, at least for the time being.

A Verizon 4G LTE eNodeB—the LTE antennas are the bigger ones on the outside

If you’ve been paying attention at all, chances are that you’re pretty familiar with the data scenario everywhere—3G networks based on tech from both the 3GPP and 3GPP2 camps are strained to capacity. The short-term solution is to deploy more and more carriers (channels) and scale capacity linearly, but that requires more and more spectrum. The long-term solution is more spectrally efficient air interfaces: better multiplexing schemes, smart antennas, and spatial multiplexing, all of which extract more capacity from the same spectrum.
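
To make the two levers a bit more concrete, here is a minimal back-of-the-envelope sketch in Python. The bits-per-Hz figures are illustrative assumptions for a loaded sector, not measured values, and the helper function is purely hypothetical.

```python
# Two ways to add capacity: deploy more carriers (linear scaling with
# spectrum) or move to a more spectrally efficient air interface (more
# bits per Hz over the same spectrum). Efficiency figures are assumptions.

def sector_capacity_mbps(carriers, carrier_mhz, bits_per_hz):
    """Crude aggregate downlink capacity estimate for a single sector."""
    return carriers * carrier_mhz * bits_per_hz

# Lever 1: more spectrum -- add 1.25MHz EVDO carriers (assume ~0.8 bps/Hz)
for n in (1, 2, 3):
    print(f"{n} x 1.25MHz EVDO: ~{sector_capacity_mbps(n, 1.25, 0.8):.1f} Mbps per sector")

# Lever 2: better efficiency -- one 10MHz LTE carrier with OFDMA and
# 2x2 MIMO (assume ~1.5 bps/Hz) gets far more out of each MHz.
print(f"1 x 10MHz LTE: ~{sector_capacity_mbps(1, 10, 1.5):.1f} Mbps per sector")
```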

The story of 4G thus far has unfortunately been dominated by semantics surrounding what suite of network tech qualifies as being truly fourth generation. Remember how I mentioned that the ITU-R set some guidelines way back for what should be considered the bar for 3G? Back then the bar was 2Mbps, for stationary or low-mobility use. The ITU-R did a similar thing for 4G, and that original guideline was an optimistic 1Gbps stationary and 100Mbps with mobility. It helps sometimes to have actual goals. The exact quote gives a bit more leeway:

“The goals for the capability of systems beyond IMT-2000 are up to approximately 100Mbps for high mobility such as mobile access and up to approximately 1Gbps for low mobility such as nomadic/local wireless access around the year 2010. These goals are targets for research and investigation and may be further developed in other ITU Recommendations, and may be revised in the light of future studies.”

In October 2010, the ITU-R recognized LTE-Advanced and WirelessMAN-Advanced (IEEE 802.16m, the next-generation WiMAX) as true 4G technologies that met the 1Gbps stationary and 100Mbps mobility requirements, in addition to a number of other guidelines. In December 2010, however, the ITU-R relented and declared that both LTE and WiMAX (as they’re deployed right now) can be called 4G. Part of this hedging was one more statement—“[in addition,] evolved 3G technologies providing a substantial level of improvement in performance and capabilities” also qualify to be considered 4G.

This is essentially leeway to allow HSPA+, which offers some of the same evolutionary enhancements (higher-order modulation, MIMO, and multicarrier), to also qualify as 4G. Without those features, I think it’s fair to argue that HSPA+ isn’t really quite the same level of advancement.
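
To put rough numbers on why those features matter, here is a minimal sketch in Python of how each one multiplies HSPA’s peak downlink rate. The 14.4Mbps baseline and the simple multipliers are approximations; the standardized UE category peaks (roughly 21, 42, and 84Mbps) differ slightly because of coding details.

```python
# Back-of-the-envelope: how each HSPA+ enhancement multiplies the peak
# downlink rate. Baseline and multipliers are approximations; standardized
# category peaks are about 21.1, 42.2, and 84.4 Mbps.

base_peak_mbps = 14.4  # baseline HSDPA: 16QAM, one 5MHz carrier, no MIMO

enhancements = [
    ("64QAM higher-order modulation (6 bits/symbol vs. 4)", 6 / 4),
    ("2x2 MIMO spatial multiplexing (two streams)", 2),
    ("Dual-carrier operation (2 x 5MHz)", 2),
]

peak = base_peak_mbps
print(f"Baseline HSDPA: ~{peak:.1f} Mbps")
for name, multiplier in enhancements:
    peak *= multiplier
    print(f"+ {name}: ~{peak:.1f} Mbps")
```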

In reality, the ITU doesn’t have any ability to police what marketers or carriers bill as 4G. Heck, they could start calling things 5G or 6G tomorrow. One friend of mine sarcastically has his N900 set to show “6G” when connected to 3G. But ultimately, the ITU should nonetheless be considered an authority for setting the bar somewhere.

Comments

  • Brian Klug - Thursday, April 28, 2011

    I mentioned that with LTE sometimes the handovers pause the data context while the handover happens. It's an occasional 50-500ms pause, sometimes a second. Honestly I noticed it more on the data cards than I did the thunderbolt or the Samsung hotspot.

    That's another thing which will improve with time.

    -Brian
  • iwod - Wednesday, April 27, 2011

    I don't think bandwidth was much of a concern for mature 3G markets. Even 1Mbps is good enough for web surfing. The problem is latency, and it is very high on 3G networks, sometimes up to 1 second.

    LTE was supposed to bring round-trip performance down to the double-digit ms range, but my skim through this article sees no test of latency.

    Another growing concern for me is that data and mobile networks just don't seem to work together. You have a finite amount of total bandwidth, but people consume data far faster than anyone would expect. I think someday we'll have to deploy nationwide micro WiFi + LTE stations to help with bandwidth, especially in populated cities. (I can't even imagine how it would work out in places like Hong Kong and China.)
  • Brian Klug - Thursday, April 28, 2011

    We tested latency on Page 10 if you're interested. Both latency as measured by speedtest.net (which isn't perfect) and by using pingplotter for almost 12 hours to a number of targets.

    It's sub 100 ms for a lot of things, and I showed gaming at 50ms to a local CS:S server. It's a definite improvement again thanks to much faster signaling and a shorter frame time.

    -Brian
  • DanNeely - Thursday, April 28, 2011

    Unless I'm misunderstanding what the graph is showing, ATT's lower C block ownership is fragmentary with no coverage at all in large parts of the country.

    http://www.phonescoop.com/articles/article.php?a=1...
  • DanNeely - Thursday, April 28, 2011

    nevermind, I misunderstood what you were saying....
  • bman212121 - Sunday, May 1, 2011

    I've seen another report from someone using LTE in New Orleans showing similar numbers. Anything sub-100ms should be fine for an FPS. I've definitely seen worse under normal circumstances. FWIW, using a D2 and comparing ping times from the phone's terminal to a PC using the 3G hotspot, the wireless added 16ms of latency.
  • bman212121 - Sunday, May 1, 2011

    I have to wonder if they didn't include USB tethering simply because they couldn't sustain the power needed. If you were having issues with a 700mA charger, then the maximum 500mA from a computer's USB port could be problematic. It is interesting that the other devices worked, though, so I'm guessing that the WiFi is what is really eating battery life.
  • tjk818 - Wednesday, July 27, 2011

    I have the Pantech UML290 and a Cradlepoint router, both updated with the latest firmware (4G LTE and 3G). It works great on 3G. I'm now converting to 4G LTE using a ZADACOM feed cut for Verizon 746-806MHz and a grid antenna (Hyperlink). Without the grid I get 1 bar constantly, sometimes going to 2 bars; with the grid I get nothing.
    Does the cable in the Pantech modem need to be connected or disconnected for it to work on the grid? I live about 3 miles from the tower. Also, is there a setting I can use in the VZAM menu (under the DIAGVZW menu) to set the modem's 4G port to activate the external antenna port and deactivate the internal antenna? I'm using a spectrum analyzer and can see the carriers from the tower at 783MHz.

    Feedback is welcome.
  • milan03 - Monday, August 22, 2011

    Hey Brian: you've mentioned that current Verizon LTE devices are Category 3, meaning they can only achieve up to 50Mbps with 2x10MHz. Are you sure that's the breakdown? Because I'm seeing 50+Mbps on a daily basis here in NYC, and when downloading a sustained, well-seeded torrent I'm seeing around 6MB/s, which makes no sense. I am convinced that the Thunderbolt is capable of 73Mbps, or up to about 60Mbps with all the overhead. Am I wrong? I do have poor upload speeds, which is explained by the Thunderbolt being 2x1 MIMO, not 2x2 like other devices, but is there any other LTE handset that's 2x2 MIMO?

    Here is what I'm seeing these days: http://i51.tinypic.com/dhe1rd.png
  • oz973 - Tuesday, January 17, 2012

    How long does it take for this to charge to 100%? And how can you tell?
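
As a rough sanity check on the peak-rate question in milan03's comment above, here is a minimal sketch in Python of the downlink arithmetic for a 10MHz carrier with 64QAM and 2x2 MIMO. The overhead fraction is an assumption; exact peaks come from the 3GPP transport block size tables, which work out to roughly 73Mbps for this configuration, while a Category 3 device caps out near 100Mbps per TTI.

```python
# Approximate LTE FDD downlink peak: 10MHz, 64QAM, 2x2 MIMO, normal CP.
# The 25% overhead figure is an assumption covering control channels,
# reference signals, and sync/broadcast.

prb = 50                      # resource blocks in a 10MHz carrier
re_per_prb_per_ms = 12 * 14   # subcarriers x OFDM symbols per 1ms subframe
bits_per_re = 6               # 64QAM carries 6 bits per resource element
layers = 2                    # two spatial streams with 2x2 MIMO

raw_mbps = prb * re_per_prb_per_ms * bits_per_re * layers / 1000
overhead = 0.25
usable_mbps = raw_mbps * (1 - overhead)
print(f"raw: ~{raw_mbps:.0f} Mbps, after overhead: ~{usable_mbps:.0f} Mbps")

# In a 10MHz carrier the air interface, not the UE category, is the
# throughput limit; Category 3 only becomes the bottleneck at 20MHz.
```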
