Feature

4G – Upping the ante

By David Gehringer, vice president of marketing, Fanfare Software

The telecoms industry has always been associated with driven, ambitious evolution, nowhere more so than in the constant development and improvement of wireless networks. This is, and always has been, the result of extremely high end-user demands and expectations, with customers requiring devices to do more and to work faster.

Simply quality

We are currently approaching a major junction in the lifecycle of wireless networks, with carriers and manufacturers signifying their commitment by building out long-term evolution (LTE) and WiMAX networks, technologies capable of supporting massively improved data speeds. The overarching issue, the one that will make or break the companies involved, is simply quality.

Of course, the importance of quality is nothing new. Ever since wireless networks started to become ubiquitous in the early 1990s, and carriers and device manufacturers were able to begin building up their customer bases, there has been a high level of scrutiny relating to one of three aspects: the build of the mobile device, wireless signal strength and device performance.


Build quality transcends markets and is, after all, unambiguous. Wireless signal strength depends on the (variable) location of the device rather than on the device itself, so while it is a wider issue, it actually affects fewer users as more network antennae are deployed; signal problems are geographically sporadic and acute, afflicting only a minority of customers.

The biggest issue, by some margin, is device performance. 


Device performance

This is an ongoing challenge simply because today’s handsets are far more complex and powerful than yesterday’s, and far less so than tomorrow’s. Looking a little deeper, the real reason device performance has become such a significant challenge for carriers and device manufacturers is the growing expectation from end users that devices can run multiple feature-rich applications, increasingly concurrently (the Palm Pre’s application-switching feature is a prime example), and, above all, that they work, all the time.

If we look back at the earliest popular handsets, running on first- and second-generation wireless networks, they would usually feature (in addition to basic calling and SMS) a calendar, a calculator and perhaps a simple, standard game. Very little was expected of these handsets compared with the multimedia-capable phones that are now so widespread.

Only ten years ago, all a user realistically expected of their mobile phone was for it to make, receive and hold calls. Of course, these basic expectations still exist, but voice calling and SMS have become so ordinary that they merely represent two of a growing number of applications that users demand to be constantly available.

An early indication of what would be possible with wireless technology came with the advent of WAP and WAP-enabled handsets in the late 1990s. Of course, WAP also serves to demonstrate what happens when applications do not work: it promised a mobile web experience but failed to deliver, suffering heavy criticism over performance issues from the outset.


Richer experience

Fast-forward to today and we have 3G networks capable of supporting a far richer mobile web experience: intensive applications such as streaming music and video, VoIP, and social networking sites such as Facebook and Twitter are used on mobiles just as much as standard voice and SMS functions.

Yet we continue to see instances of carriers and handset manufacturers struggling to deliver. 

From the strain that free data for iPhone users has placed on O2’s network to the user-experience problems that have recently plagued Nokia and RIM, quality persists as a major obstacle to success. At the same time, as general internet access and use has boomed, brands have found themselves subject to far more widespread consumer backlash whenever these quality issues arise.

As we head towards 4G and an ever more feature-rich mobile environment, where interoperability is an elementary requirement, the bottom line is that handsets need to be tested in conjunction with the specific service provider market to ensure that converged applications work. Users want applications that rely on the network to function as flawlessly as calls and SMS, however much more demanding they may be.

This will require a change in testing.


Test to standard

Today, with very few exceptions, carriers worldwide do not test handsets. Instead, manufacturers test handsets and applications against a set of standards, while a carrier validates that its network works to those standards by testing applications and calls, but not specific phones. In theory, if the standards are the same, everything should work, and in practice it usually does; but not always, and 4G raises the bar considerably, as end users have already begun to demand flawless interoperability and converged applications.
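
To see why standards-based testing alone falls short, consider the numbers. The following sketch, in Python and purely illustrative (the handset, network and application names are hypothetical, not real test targets), contrasts the handful of standalone conformance checks run today with the full set of device-on-network combinations that true interoperability testing would cover:

    from itertools import product

    # Hypothetical inventories -- illustrative names only.
    handsets = ["handset_a", "handset_b", "handset_c"]
    networks = ["carrier_x_lte", "carrier_y_lte"]
    applications = ["voice", "sms", "video_stream", "voip"]

    def passes_standard(component: str) -> bool:
        """Stand-in for a standards-conformance check
        (always passes in this toy example)."""
        return True

    # Today: each side is tested against the standard in isolation.
    standalone = [h for h in handsets if passes_standard(h)] + \
                 [n for n in networks if passes_standard(n)]
    print("Standalone conformance checks:", len(standalone))   # 5

    # What 4G-era quality demands: every application exercised on
    # every handset/network pairing, since conformance on each side
    # does not guarantee the combination works.
    combinations = list(product(handsets, networks, applications))
    print("Interoperability tests needed:", len(combinations))  # 24

Even at this toy scale, the combination count dwarfs the standalone checks, and it grows multiplicatively with every new device, network and application; hence the case for automating the testing rather than multiplying manual effort.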

This is the expectation of today’s customers, so how will it be achieved, and who is going to own the testing of specific phones and applications on carrier networks? 

At the end of the day, those who do are going to grab market share.

Fanfare Software helps service providers and network equipment manufacturers deliver quality solutions to market faster and more efficiently. Its automated testing solutions streamline system and device testing, allowing companies to reduce cycles and costs.  www.fanfaresoftware.com