By Alex Daley and Doug Hornig

The development of converged networks is one of the hottest trends in technology today. Packets, packets, it’s all about the packets…

Broadly speaking, the term means the unification of all communications and broadcast media – from telephone calls, to television, to the Web – onto a single platform, allowing businesses to roll out new services to customers and end users with no change to the underlying network.

Reducing capital costs while providing the flexibility to allow anything from multiway, real-time, synchronous communications (eight-way Skype video call, anyone?) to cloud storage and asynchronous communication (think visual voicemail and seven seasons of Weeds on Netflix) – that’s the goal, and the challenge.

We have always been a mobile society. Don’t like it where you are? Pull up your roots and plant them someplace else. Not an option? Then just import enough goods to transform the place you find yourself into something resembling where you’d like to be. Over the past couple of centuries, major technological advancement was tied largely to two things: moving people and things more efficiently from place to place, i.e., through railroads, steamships, automobiles, aircraft; and communicating over previously prohibitive physical distances, i.e., via telegraph, telephone, radio, TV, and Internet. It’s the latter that’s of interest here.

Early long-distance communication was accomplished over wires and undersea cables. Radio and TV changed that, though being unidirectional they were limited to the entertainment sphere and to local arenas only. Before the advent of commercial satellites, it was simply not possible to reach more than a small area with a single broadcast. Hence the build-out of cable television networks.

Cable growth was slow but steady, from a few thousand subscribers in the early 1950s to nearly 16 million by the end of the 1970s. That accelerated in the mid-1980s, after the passage of the Cable Act. The result: over 110 million global subscribers (about a quarter of them in the US), just a fraction of the 4 billion televisions in over 2.1 billion households worldwide – the balance using satellite and terrestrial broadcast.

The phone system evolved the opposite way, born hard-wired and remaining so much longer. Many of us can easily remember when “party lines” were common; you needed an operator to call outside your immediate area; phone numbers had only five digits; and Superman could easily find a booth in which to liberate his cape. Phones were a wonderful modern convenience, but expensive, cumbersome, and static-ridden. And there was nothing for the guy or gal on the go.

An enormous mobile communication niche was just sitting there, waiting for someone to fill it.

Cellphone antecedents date back a half-century, but the real deal didn’t arrive until 1997, with the release of the first pocket flip phone. That’s barely fifteen years ago. Since then, there has been a technological explosion unprecedented in human history. Nothing has ever caught on, worldwide, as fast and as furiously.

Motorola eventually sold 60 million of its first clamshell phones. Not bad for a new product. But consider the estimated number of worldwide mobile subscriptions at the end of 2011: 5.9 billion. That’s one phone for every 1.2 human beings on planet Earth, with parity expected no later than 2014 (parity, of course, does not mean that every living person has access to a mobile phone).

The next problem, however, was that each communications medium had been tailored to do only one thing. Telephone systems were built on the principle of circuit switching for much of their history. One wire meant one conversation. Or one slice of wireless spectrum meant one phone call (anyone who had an early cellphone that was cloned illegally and who received big bills for calls never made can remember those days). As demand for services increased, innovation brought ideas like time division, where 4 or 8 or 16 parties could share a circuit without knowing it was happening. But the network was still only a voice network.
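The slot-sharing idea behind time division can be sketched in a few lines of code. This is a toy illustration, not any real telephony standard; the function names and sample values are invented for the demo.

```python
# Time-division multiplexing (TDM): several calls share one circuit by
# taking turns in fixed, repeating time slots.

def tdm_multiplex(calls):
    """Interleave equal-length sample streams into one slot sequence."""
    line = []
    for samples in zip(*calls):   # one sample from each call per frame
        line.extend(samples)
    return line

def tdm_demultiplex(line, n_calls):
    """Recover each call by reading every n-th slot."""
    return [line[i::n_calls] for i in range(n_calls)]

# Four "conversations" sharing one line without knowing it:
calls = [[10, 11, 12], [20, 21, 22], [30, 31, 32], [40, 41, 42]]
line = tdm_multiplex(calls)
# line == [10, 20, 30, 40, 11, 21, 31, 41, 12, 22, 32, 42]
assert tdm_demultiplex(line, 4) == calls
```

Each subscriber sees only every fourth slot, so from the customer's point of view the circuit behaves like a dedicated line – it is just a slower one.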

Cable was much the same. You could watch television channels, but you couldn’t use the same wire for anything else. New features like pay per view were added, but it was still just a television network.

Then along came the Internet, with technology that was revolutionary in its simplicity. It divided information up into little packets and could ship those packets – containing virtually any kind of data – anywhere. At first the systems were slow, allowing transmission of simple text documents and maybe some small images. But as demand for services exploded – Net users grew from about 50 million in 1996 to an estimated 2.1 billion by the end of 2011 – so did the speed of the networks behind them.
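The packet idea really is that simple, as a short sketch shows. The packet format below is invented for illustration; real protocols like IP and TCP add headers, checksums, and retransmission on top of the same basic scheme.

```python
import random

# Packet switching in miniature: chop any data into small numbered
# packets that travel independently, then reassemble at the destination.

def packetize(data: bytes, size: int):
    """Split data into (sequence_number, chunk) packets."""
    n = (len(data) + size - 1) // size
    return [(i, data[i * size:(i + 1) * size]) for i in range(n)]

def reassemble(packets):
    """Rebuild the original data even if packets arrived out of order."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"Packets, packets, it's all about the packets."
packets = packetize(message, 8)
random.shuffle(packets)              # the network may reorder them
assert reassemble(packets) == message
```

Because the packets carry sequence numbers, the network is free to route each one however it likes – the content, whether voice, video, or text, makes no difference to the pipe.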

As the “all-digital, all-the-time” world unfolded at roughly the speed of light, innovations tried to keep pace. People began making phone calls on Skype, surfing the Web on their big flat screens, and watching last night’s TV shows on their computer monitors. Increasingly, it was all about the transmission of data packets, whether for research, business, entertainment, market transactions, or whatever.

And, depending on your perspective, as a cable or telephone executive it was either the biggest threat to your business imaginable or the biggest opportunity since the original. Businesses of all kinds responded to the latter, and something incredible began happening. Suddenly, cable-television operators were in the business of selling telephone services. Telephone companies were selling digital television services. And everyone was selling Internet connections. The race was on for traditional suppliers with wired networks to upgrade their equipment, change out their networks, and make them all packetized – fully Internet Protocol (IP) ready.

Yet, for over a decade, one “legacy” provider was too busy to really even notice the Internet. Mobile-phone providers were grappling with the fastest-growing business of all time – three times faster than global Net access growth. Trying to squeeze every last bit out of their growing networks, they invested in highly specialized equipment, much like cable and television operators before them. There was demand, of course, for Web access on the go. So they rigged up connections, using the Wireless Application Protocol or “WAP,” between the Internet and their networks. But the technology was slow and cumbersome.

Customers demanded something better. As the Web, email, Internet video, and the like became more pervasive, more and more users began to look to their mobile-phone providers to give them access.

The answer came in 2007, when Apple introduced the iPhone. It wasn’t the first of its kind – smartphones had been around in one form or another since 1993, when IBM introduced Simon – but it was the first to hit the market at top speed… and it never looked back.

Did Steve Jobs envision what would happen next? Maybe. But the public response probably exceeded even his expectations. A few numbers:

13% – the smartphone share of the world mobile market;

78% – the percentage of worldwide mobile data traffic that is consumed by smartphones;

472 million – estimated number of smartphones sold worldwide in 2011;

982 million – worldwide estimated smartphone sales in 2015, a mere eight years after the iPhone’s introduction.

Data speeds on wireless have exploded as well, jumping from the kinds of speeds a dialup connection could have garnered you in 1995 to near broadband today. Now, with LTE and WiMAX, users can connect wirelessly with their phone, laptop, and/or tablet, watch a few Netflix movies, make phone calls, download presentations for the office, and much more. All of this has required a massive investment from wireless companies in retooling and upgrading their networks – more than $100 billion globally per year.

Beyond the popularity explosion and the wildly proliferating number of apps available anywhere, anytime, many people are now envisioning where this is leading. Imagine that you get up in the morning and turn on your phone (at some point we’re going to have to invent a more descriptive term for this amazing device). You connect to your home WiFi system, check your email, see what gold has done overnight, make a Skype call to the Singapore branch of the company, and order flowers sent to one of your colleagues for her birthday. You’re on the phone with your boss when you move outside, and you’re immediately transferred to the cellular network. Your conversation never drops. On the way into the city, your carpool gets caught in a traffic jam and you’re running a little late, which means you’ll miss the start of that 8:30 teleconference with Chicago and L.A. No problem – you just join it by smartphone (via a hands-free headset, of course). Once into your building, you’re automatically switched to the local network there. By the time you reach your desk, the teleconference is proceeding on your office computer and you haven’t missed a word.

We aren’t there yet. But we’re getting very close.

The convergence of all these communications services requires tons of new hardware, plenty of highly specialized software, and a whole heck of a lot of bandwidth, otherwise known as “channel capacity,” which is the maximum throughput of a logical or physical path in a digital communication system. Colloquially, this is known as the “pipe,” through which all things flow. The fatter the pipe, the more you can send down it.
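The “fatter pipe” intuition has a precise form: the Shannon-Hartley theorem bounds a channel’s capacity by its bandwidth and its signal-to-noise ratio. The numbers below are round, illustrative values, not measurements from any particular network.

```python
import math

def channel_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley upper bound on error-free throughput (bits/s)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 20 MHz channel (a common LTE carrier width) at an SNR of 100 (20 dB):
capacity = channel_capacity_bps(20e6, 100)
print(f"{capacity / 1e6:.1f} Mbps")   # roughly 133 Mbps upper bound
```

The formula explains why the industry chases both wider channels (more spectrum, thicker fiber) and cleaner signals (better modulation, shorter distances to the tower): either one fattens the pipe.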

The old dialup-modem download standard of 56 kilobits per second (Kbps) just doesn’t cut it. DSL and cable modems that allowed for download speeds in single-digit megabits per second (Mbps) were an improvement and enabled things like Voice over Internet Protocol (VoIP) to take off, but were simply not ready to handle video. And for serious converged voice/data/video transmission, dozens of megabits all the way up to gigabits per second (Gbps) will have to become the new normal.
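The arithmetic behind “just doesn’t cut it” is worth seeing. Here is the time to move a 1 GB video file at each era’s typical download speed, using round figures consistent with the text:

```python
def transfer_time_seconds(size_bytes: float, speed_bps: float) -> float:
    """Idealized transfer time: 8 bits per byte, no protocol overhead."""
    return size_bytes * 8 / speed_bps

one_gigabyte = 1e9
for label, bps in [("56 Kbps dialup", 56e3),
                   ("5 Mbps DSL/cable", 5e6),
                   ("1 Gbps converged pipe", 1e9)]:
    hours = transfer_time_seconds(one_gigabyte, bps) / 3600
    print(f"{label}: {hours:.3f} hours")
```

Dialup needs roughly 40 hours for that file; a gigabit pipe needs about 8 seconds. Real-world numbers are worse once protocol overhead and congestion are added, which only strengthens the point.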

On the backbone, to support the wants of this burgeoning user base, speeds must aggregate to much higher numbers if we are not to have the digital equivalent of a twenty-car pileup on the highway, day in and day out. To meet these needs, 10 Gbps pipes were introduced in 2003, and the first 100 Gbps optical transmission technology was shipped to commercial customers last year. One company we follow in Casey Extraordinary Technology is showing bonded channels with 500 Gbps capacity and up.

And, just as the telephone system went wireless with explosive growth and wireless continues to dominate the television landscape, so the future of the converged network is wireless, too. But to get there, we have even further to go speed-wise. While home and business users now mostly enjoy speeds strong enough to support this converged ideal, mobile is still behind. For example, PC Mag tested various smartphones last June in different parts of several cities. Maximum download speed topped out in Dallas, at 14.98 Mbps for third-generation (3G) service, which is what most people have. The maximum on Verizon 4G came in at 37.66 Mbps. Average 3G speeds were, however, much lower – generally in the 1-3 Mbps range. This is simply not fast enough to support the full range of applications. However, things are changing quickly.

Most industry observers believe that the future of wireless lies with a technology dubbed LTE, which is shorthand for “3GPP Long Term Evolution for the Universal Mobile Telecommunications System.”

Providers call LTE their fastest, most advanced network. But it’s not just the next generation of wireless; it’s an ongoing, evolving technology, one that will continuously improve over time. It’s poised to become the standard for cellular networks for the next decade, if not beyond. In that same PC Mag test, users of 4G service – which is rolling out in a handful of cities around the US as we write – were getting sustained data speeds of up to 15.75 Mbps, beyond what a cable Internet subscriber in most markets could expect just five years ago (and what many still receive today).

LTE represents a paradigm shift, from hybrid voice + data networks to data-only networks, where voice is handled with the same technology as cable telephony and Skype. Network operators that are deploying it want to replace everything else they have with it. It must be able to handle voice calls and text messaging, as well as Internet services. Trouble is, LTE was designed with data only in mind. So a new VoIP (Voice over Internet Protocol) solution had to be developed.

A couple of different ones were tried, but the one that stuck was VoLTE-IMS (Voice over LTE via IP Multimedia Subsystem), or simply VoLTE. VoLTE supports text messaging and high-quality speech encoding, which will provide clearer calls. It also has the potential to support video calling, but no standard for that exists as yet.
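In outline, “voice over a data-only network” means the codec’s output is sliced into short frames (commonly 20 ms) and each frame is shipped as a numbered, timestamped packet, RTP-style. The field names and sizes below are simplified placeholders for illustration, not the real RTP or VoLTE packet layout.

```python
# Sketch of VoIP packetization: slice an audio stream into short,
# sequenced voice packets that a data network can carry like any other.

SAMPLE_RATE = 8000          # audio samples per second (narrowband voice)
FRAME_MS = 20               # one packet per 20 ms of speech
SAMPLES_PER_FRAME = SAMPLE_RATE * FRAME_MS // 1000   # 160 samples

def voice_to_packets(samples):
    """Slice a stream of audio samples into sequenced voice packets."""
    packets = []
    for seq, start in enumerate(range(0, len(samples), SAMPLES_PER_FRAME)):
        packets.append({
            "seq": seq,
            "timestamp": start,    # position in sample units
            "payload": samples[start:start + SAMPLES_PER_FRAME],
        })
    return packets

one_second_of_audio = list(range(SAMPLE_RATE))
packets = voice_to_packets(one_second_of_audio)
assert len(packets) == 50          # 50 voice packets per second
```

The sequence numbers and timestamps are what let the receiver reorder late packets and play the speech back smoothly – the same trick the Internet has always used for data, now applied to the phone call itself.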

Verizon plans to roll out handsets with VoLTE late this year or early next. AT&T will offer its own version in 2013.

Mobile tech will probably never supplant wire-based data systems for applications that demand the utmost in speed. And even LTE cannot deliver that seamless, all-encompassing experience described earlier.

But it’s taking some intermediate steps. LTE will utilize a technology known as MIMO (multiple input, multiple output), in which devices use multiple antennas to carry several simultaneous data streams to a single cell. That increases the stability of the connection, reduces latency, and increases the total throughput of a connection. MIMO is what allows 802.11n WiFi to reach speeds of 300-400 Mbps, or some ten times that of mobile. Expect mobile to close the gap.
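The MIMO payoff can be stated as a rule of thumb: with multiple antennas, roughly min(transmit, receive) independent spatial streams can share the same slice of spectrum, so peak throughput scales about linearly with that minimum. The sketch below is an idealized model with illustrative numbers, ignoring real-world channel conditions.

```python
def mimo_peak_bps(per_stream_bps: float, tx_antennas: int,
                  rx_antennas: int) -> float:
    """Idealized MIMO peak rate: one full-rate stream per antenna pair."""
    return per_stream_bps * min(tx_antennas, rx_antennas)

# Assume a single 75 Mbps stream (an illustrative LTE-class figure):
assert mimo_peak_bps(75e6, 2, 2) == 150e6   # 2x2 MIMO doubles it
assert mimo_peak_bps(75e6, 4, 4) == 300e6   # 4x4 MIMO quadruples it
```

In practice, interference and antenna spacing limit how cleanly the streams separate, but the scaling is why each LTE revision keeps adding antennas rather than just spectrum.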

Overall, there are plenty of bugs still to be flushed out as true unified data delivery moves toward reality. Many innovators are working hard on the problems, because the companies that can successfully ride the wave of network convergence – and we hold some of the most promising ones in the CET portfolio – are going to do very well indeed.

Bits & Bytes

“Enemy” App Created for Facebook (The Chronicle of Higher Education)

Dean Terry wants Facebook to be more like real life. So with a lot of help from a couple of students, the college professor created EnemyGraph, an application that allows you to tag any Facebook friend, user of the app, or any page or group on Facebook as an “enemy.” The basic idea behind the app is that people form connections with others based on their mutual dislikes in addition to their mutual likes. And relationships can be enhanced and conversations created when two friends disagree on what they like. Terry is just trying to add this bit of reality to the fairy-tale world that is social media today. In case you are interested, the current top enemies are Rick Santorum, followed by Justin Bieber and Westboro Baptist Church. But keep in mind that the app is relatively unknown still, so sample sizes are very small.

Lie to Me (Futurity)

Computer scientists at the University at Buffalo have developed a new computer lie-detection method that tracks eye movements and correctly detects deceit 82.5% of the time – better than expert human interrogators typically achieve.

Fighting Cancer with Nanoparticles and Magnetism (ScienceDaily)

Scientists at the University of Georgia have used nanoparticles and alternating magnetic fields to kill head and neck tumor cells in mice in the time it takes Domino’s to deliver a pizza – and without harming healthy cells.

Poo Power (PopSci)

Employees of the Denver Zoo, along with some outside help, have modified a motorized rickshaw to run on animal dung and garbage. And that’s just the beginning. The zoo plans to use the technology to generate power at its upcoming ten-acre elephant exhibit. Eventually, the zoo thinks it will be able to turn 90% of its waste into energy, making use not only of the copious amounts of animal poo it has on hand but also eliminating some 1.5 million pounds of annual garbage waste that previously went into landfills.