3G Wireless SNAFU
By Dave Mock, Thu May 24, 2001

With numerous problems plaguing 3G rollouts, how will they ever happen? Here's how.


The term SNAFU is said to have originated in the military around the time of World War II. The polite definition of the acronym is given as "Situation Normal; All Fouled Up." Some more creative adjectives have been used in place of "fouled," but they convey the same idea, albeit more emphatically.

At a glance, it seems appropriate to apply it to the whole mess surrounding the development and commercialization of 3G networks around the world. Barely a week goes by now without a network operator admitting to system bugs or testifying that there's no financial motivation to rapidly deploy new services.

But to truly fit the bill of a SNAFU, the current state of mobile disarray would have to be considered a normal situation. Based upon news clips and commentary from the industry, you'd think that it was anything but normal.

As a matter of fact, you may be left with the impression that the current situation borders on outright chaos, with the entire industry hanging in the balance and one more small slip poised to send the wireless empire into complete ruin. At the very least, far too many people have bought the line that 3G networks will likely fail to operate at all, forcing hundreds of companies to scrap years of work, file Chapter 11 and start over.

In reality, though, this situation is normal. The delays that companies are experiencing are typical, not shocking. The decisions to delay deployment for financial reasons are shrewd, not embarrassing.

While it may be nice to claim "first to 3G" bragging rights, it's downright foolish to sacrifice customer loyalty and permanently damage your brand by pushing ahead with a technology that is not ready for prime time. With a little more careful thought and perspective on the issues, many would realize that the current events surrounding 3G commercialization are typical and in the best interest of the industry.

Levels of complexity


To give a little perspective on what it takes to commercialize a 3G mobile network, let's look at what's involved. System operators must coordinate the development of several different subsystems, such as switches, controllers and handsets, with multiple vendors working from highly complex specifications.

Once simulated, developed and tested individually, these major subsystem components must be integrated in a test environment to make sure they all play nicely with each other. After this phase, they can be qualified as they move to small-scale testing in real-world environments. Once operation is verified at that stage, the network is ready to scale up into a commercial service.

At each stage, bugs and other problems are often uncovered in various areas. No matter how much forethought goes into the designs, some problems inevitably hide out until the actual, complete system is running live. And each delay from an individual company has the potential to set the entire project back.

Today, wireless networks based on 2G technologies such as GSM and TDMA take a fairly simple approach to planning for good service. Basically, a good approximation of the "user density" in various areas is used to locate cells properly. The capacity of the network is fixed, and the various levels of QoS are specified up front. As a result, real-world operation follows a predictable path, handling voice calls as it should and dropping them when capacity is exceeded.
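
To make the fixed-capacity point concrete, here is a minimal sketch of the dimensioning exercise behind a circuit-switched cell, using the classic Erlang B model. The offered traffic and the 2% blocking target below are invented for illustration and don't come from any particular operator.

    # Sketch: circuit-switched cell dimensioning with the classic Erlang B model.
    # The offered traffic and blocking target are illustrative assumptions only.

    def erlang_b(traffic_erlangs: float, channels: int) -> float:
        """Blocking probability when `traffic_erlangs` is offered to `channels` circuits."""
        b = 1.0
        for k in range(1, channels + 1):
            b = (traffic_erlangs * b) / (k + traffic_erlangs * b)
        return b

    def channels_needed(traffic_erlangs: float, target_blocking: float = 0.02) -> int:
        """Smallest channel count that keeps blocking at or below the target grade of service."""
        n = 1
        while erlang_b(traffic_erlangs, n) > target_blocking:
            n += 1
        return n

    if __name__ == "__main__":
        offered = 25.0  # Erlangs of voice traffic estimated for one cell (hypothetical)
        print(channels_needed(offered))  # channels required to meet a 2% blocking target

Once the planner knows the expected traffic per area, the channel count per cell follows mechanically, which is exactly why 2G planning is comparatively predictable.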

With most 3G networks now incorporating some form of CDMA, the loading capacity of each cell is coupled with the coverage provided by the individual station. While network operators may have very good historical data on traffic density in various areas of their established networks, a new 3G network cannot simply duplicate the current network topology. From the RF perspective, a 3G network built into an area already covered by GSM needs more sophisticated planning tools.
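
To see why capacity and coverage become coupled, consider a rough sketch using the standard uplink load-factor approximation from CDMA/WCDMA planning. Apart from the WCDMA chip rate, every figure below is an illustrative assumption rather than data from any real network: as a cell fills up, interference rises, eats into the link budget and shrinks the usable coverage area, the so-called cell breathing effect.

    # Sketch: the coverage/capacity coupling ("cell breathing") in a CDMA uplink.
    # Standard WCDMA-style load-factor approximation; all parameter values are
    # illustrative assumptions, not measurements from any operator's network.
    import math

    W = 3.84e6                  # WCDMA chip rate (chips/s)
    R = 12.2e3                  # voice bearer rate (bits/s)
    EB_N0 = 10 ** (5.0 / 10)    # assumed required Eb/N0 of 5 dB, as a linear ratio
    ACTIVITY = 0.67             # assumed voice activity factor
    OTHER_CELL = 0.55           # assumed other-to-own-cell interference ratio

    def uplink_load(users: int) -> float:
        """Approximate uplink load factor for `users` identical voice connections."""
        per_user = 1.0 / (1.0 + W / (EB_N0 * R * ACTIVITY))
        return (1.0 + OTHER_CELL) * users * per_user

    def noise_rise_db(users: int) -> float:
        """Interference rise over thermal noise; every dB comes out of the link budget."""
        load = uplink_load(users)
        if load >= 1.0:
            return float("inf")  # the "pole capacity": the cell can carry no more
        return -10.0 * math.log10(1.0 - load)

    for n in (10, 30, 60, 90):
        print(n, "voice users -> noise rise of about", round(noise_rise_db(n), 1), "dB")

The noise rise climbs slowly at first and then shoots up as the cell approaches its pole capacity, which is why a 3G plan can't simply reuse the site grid that worked for a fixed-capacity GSM overlay.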

For operators that already employ 2G CDMA systems, the migration to higher-speed data services also poses problems, though of a different nature. While a CDMA carrier may be more familiar with this type of network planning, the additional demand for data capacity and the varying priority requirements can quickly lead to coverage shortfalls if the up-front analysis isn't careful.

With most of the world's wireless operators lacking practical experience in CDMA, though, the learning curve is starting to become apparent. Look for many more problems and delays in 2002 as the initial trials of fielded networks fall somewhat short of intended capacity and coverage.

Regardless of past experience, one of the biggest challenges for network operators in the years to come will be the shift from circuit-based networks to packet networks. The entire history of mobile communications has centered on voice connectivity and only limited data capabilities. As such, current networks are optimized around circuit-based architectures and all the billing and QoS drivers stem from the need to provide secure voice channels.

Moving from this world to a place where information transmitted on the network is governed by a more complex scheme of varying priorities is enough to give a network planner headaches. Just how do you design a system that can handle significant fluctuation in voice and data users in real time while allocating the appropriate bandwidth for adequate QoS to all users?
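
As a purely illustrative answer, here is a toy allocator that captures the dilemma in miniature: guaranteed-rate voice is admitted first, circuit style, and whatever capacity is left over is shared among data users in proportion to priority weights. The flow classes, rates and weights are made up for the sketch, and real 3G schedulers are far more sophisticated, but the tension between hard guarantees and best-effort sharing is the same.

    # Sketch: a toy per-interval bandwidth allocator for mixed voice and data.
    # Voice gets its guaranteed rate (or is blocked); data shares the leftover
    # capacity by priority weight. All flows, rates and weights are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Flow:
        name: str
        kind: str          # "voice" (guaranteed rate) or "data" (best effort)
        rate_kbps: float   # guaranteed rate for voice, requested rate for data
        weight: float = 1.0

    def allocate(cell_kbps: float, flows: list[Flow]) -> dict[str, float]:
        """Return a per-flow bandwidth grant for one scheduling interval."""
        grants: dict[str, float] = {}
        remaining = cell_kbps
        # 1. Voice is circuit-like: grant the full rate or block the call outright.
        for f in (f for f in flows if f.kind == "voice"):
            grant = f.rate_kbps if remaining >= f.rate_kbps else 0.0
            grants[f.name] = grant
            remaining -= grant
        # 2. Data shares the leftover capacity in proportion to priority weights.
        data = [f for f in flows if f.kind == "data"]
        total_weight = sum(f.weight for f in data) or 1.0
        for f in data:
            grants[f.name] = min(f.rate_kbps, remaining * f.weight / total_weight)
        return grants

    if __name__ == "__main__":
        flows = [Flow("call-1", "voice", 12.2), Flow("call-2", "voice", 12.2),
                 Flow("web-1", "data", 144.0, weight=2.0), Flow("mail-1", "data", 64.0)]
        print(allocate(200.0, flows))  # grants shift as voice load fluctuates

Even this toy version shows the planner's problem: every voice call admitted changes what every data user sees, in real time.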

Houston, we have a problem


While analogies always break down somewhere, they are useful for gaining a new perspective on an issue. With respect to 3G networks, for instance, it may not be far-fetched to compare them to another complex system: the NASA Space Shuttle.

Admittedly, there are probably more differences than similarities between the two. In one system, real people are strapped to two huge boosters that can't be turned off once they're lit, so catastrophic failure is to be avoided at all cost. Launching 3G wireless networks does not pose a similar threat to human life, at least not in any direct, immediate way.

But studying what it takes to launch a space shuttle reveals many attributes that carry over into other technical endeavors. The shuttle program is a very long, expensive process that takes the effort and collaboration of hundreds of organizations, both private and public. 3G networks are approaching a decade of development in international bodies, and everyone now knows about the hundreds of billions that have been committed to getting them into operation.

The goal of the shuttle program is to push the limits of technology to set the stage for discovery. Getting to the discovery phase requires a complex system of technologies to be literally "launched" into space. Discovering new freedoms of a mobile lifestyle and wireless capabilities requires a similar launch - a point where the conceptual transforms into the operational.

Having literally hundreds of independent pieces of intelligent hardware and software working together seamlessly in either system is an incredible feat. Both systems must also overcome major technical hurdles at two levels - subsystems and total system. In both cases, the ultimate difficulty does not lie in the operation of individual subsystems but rather in the successful integration of all the separate pieces.

So if there are a number of ways in which they are similar, what can be learned from the shuttle program's long track record? Well, the simple answer is that if you were to play it like Vegas and go with the odds, you'd bet against the launch going off as planned.

Up until 1998, roughly 30% of the 93 shuttle launches went off as planned. The vast majority were delayed, sometimes several times, by technical problems or weather conditions. Viewed on a flat spreadsheet, many people would call the program a failure simply because of the low success rate for planned launches.

But viewed in the perspective of what has been accomplished, the program is an unprecedented success. In light of the tragic loss of the Challenger shuttle and its crew, each of those launch delays or cancellations suddenly becomes reassuring rather than something to bemoan. Fortunately, this type of danger doesn't exist in terrestrial cellular systems, but the motivation to get it right before launch is shared.

For certain, there will be many years of 3G network optimization ahead. The initial delays and setbacks in launching 3G networks are only giving companies a glimpse of what it takes to make them operational. Getting them to where they function smoothly will take years, just as it did for GSM and other 2G technologies.

Failure is an option


It probably sounds crazy that some companies purposely set themselves up for failure. While I can't speak directly about the corporate environment at NTT DoCoMo or any other carrier pushing the envelope, I can tell you that frontrunners in the industry don't get where they are by making every milestone on time.

A company that hits every mark it sets is not living up to its potential, period. In the same way, a society, culture or industry is not progressing if it doesn't sometimes experience outright failures, or at least significant setbacks. The progress of wireless technology thrives on hardship and strife between competitors and standards bodies.

So the next time you read press mocking a company for missing a product or service rollout, think of how the industry would receive the alternative: a watered-down, half-baked effort at bringing a new product or service to market behind someone else. The early movers are under the scrutiny of a spotlight because they are leaders. The alternative is to follow and blend in, becoming just another player. To demonstrate this, quickly name three companies that don't put their necks on the line.

Still thinking?

Dave Mock is a freelance writer covering mobile technologies and markets. He has published papers educating investors in wireless markets, available through Amazon.com and BarnesandNoble.com. He also speaks at seminars and provides training to corporate clients. His first hardcover book on investing in wireless will be published by McGraw-Hill in spring 2002.