Continued from an earlier post – RIL’s 4G push and the disruptions that could follow from it. The first disruption was the ability to create data hotspots with high-speed WiFi over 4G backhaul.
Disruption 2: Triple play services across voice#, data, media, education and broadcast segments
The second disruption is in terms of triple play services – and the game really looks close to the broadcast services that I had written about some time back. To that extent, RIL would look to cut into the revenue space of cable companies. In one of my earlier posts, I had mentioned how telco operators could up-end both cable distribution and content management, and RIL seems to be planning content exclusivity along with data-pipe ownership. (Now that is something even Apple hasn’t perfected.) The triple play strategy of voice, data and video could be very effective in acquiring wallet share of the customer. By exploring the television and cable space, RIL is looking to tap another potential 100 million screens for its broadband services and drive the triple play offering.
RIL as a broadcaster is geared up for media convergence and will deliver content to any new platform. RIL has been creating a walled garden for content, which will give new opportunities to VAS (value-added service) providers and open up space for new niche channels.
Disruption 3: The price factor
With an eye on the larger pie, Reliance would bet on building a data habit into the mainstream Indian customer psyche. There are rumours of Rs 3,500 tablets and data plans priced around Rs 10 per GB. With a mighty 4G backhaul powered by 802.11n WiFi, data speeds of up to 100 Mbps to 1 Gbps and data at Rs 10/GB, traffic volumes are bound to multiply. RIL is betting on this traffic surge to build its big push in telecom.
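As a quick back-of-the-envelope check, the rumoured numbers above can be run through a few lines of arithmetic. The Rs 10/GB tariff and the speed range are the rumoured figures from this post, not confirmed pricing:

```python
# Back-of-the-envelope: what Rs 10/GB at 4G speeds would imply.
# Figures are the rumoured ones quoted above, purely illustrative.

def seconds_to_download(size_gb: float, link_mbps: float) -> float:
    """Time to move size_gb of data over a link of link_mbps."""
    size_megabits = size_gb * 8 * 1024  # GB -> megabits
    return size_megabits / link_mbps

price_per_gb = 10.0  # rupees, rumoured tariff

for speed in (100, 1000):  # Mbps, the quoted 100 Mbps to 1 Gbps range
    t = seconds_to_download(1, speed)
    print(f"1 GB at {speed} Mbps: {t:.0f} s, cost Rs {price_per_gb:.0f}")

# Even a heavy user pulling 30 GB a month would pay only Rs 300 --
# cheap enough to build the data habit the strategy relies on.
monthly_gb = 30
print(f"Monthly bill: Rs {monthly_gb * price_per_gb:.0f}")
```

At these rates the cost of data effectively stops being a consideration for the user, which is exactly the behavioural shift the strategy bets on.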
#: Assuming that RIL finds a way to muscle in on VoIP regulation for inland calls as well. RIL does have the clout to make that possible.
“I look for hot spots in the data, an outbreak of activity that I need to understand. It’s something you can only do with Big Data.” – Jon Kleinberg, a professor at Cornell
Researchers have found a spike in Google search requests for terms like “flu symptoms” and “flu treatments” a couple of weeks before there is an increase in flu patients coming to hospital emergency rooms in a region (and emergency room reports usually lag behind visits by two weeks or so).

Global Pulse, a new initiative by the United Nations, wants to leverage Big Data for global development. The group will conduct so-called sentiment analysis of messages in social networks and text messages – using natural-language deciphering software – to help predict job losses, spending reductions or disease outbreaks in a given region. The goal is to use digital early-warning signals to guide assistance programs in advance to, for example, prevent a region from slipping back into poverty.
In economic forecasting, research has shown that trends in increasing or decreasing volumes of housing-related search queries in Google are a more accurate predictor of house sales in the next quarter than the forecasts of real estate economists.
Big Data has its perils, to be sure. With huge data sets and fine-grained measurement, statisticians and computer scientists note, there is increased risk of “false discoveries.” The trouble with seeking a meaningful needle in massive haystacks of data is that “many bits of straw look like needles.”
Data is tamed and understood using computer and mathematical models. These models, like metaphors in literature, are explanatory simplifications. They are useful for understanding, but they have their limits. A model might spot a correlation and draw a statistical inference that is unfair or discriminatory, based on online searches, affecting the products, bank loans and health insurance a person is offered, privacy advocates warn.
Despite the caveats, there seems to be no turning back. Data is in the driver’s seat. It’s there, it’s useful and it’s valuable, even hip. It’s a revolution. We’re really just getting under way. But the march of quantification, made possible by enormous new sources of data, will sweep through academia, business and government. There is no area that is going to be untouched.
There is plenty of anecdotal evidence of the payoff from data-first thinking. The best-known is still “Moneyball,” the 2003 book by Michael Lewis, chronicling how the low-budget Oakland A’s massaged data and arcane baseball statistics to spot undervalued players. Heavy data analysis had become standard not only in baseball but also in other sports, including English soccer, well before last year’s movie version of “Moneyball,” starring Brad Pitt.
Artificial-intelligence technologies can be applied in many fields. For example, Google’s search and ad business, and its experimental robot cars, which have navigated thousands of miles of California roads, both use a bundle of artificial-intelligence tricks. Both are daunting Big Data challenges, parsing vast quantities of data and making decisions instantaneously.
The wealth of new data, in turn, accelerates advances in computing – a virtuous circle of Big Data. Machine-learning algorithms, for example, learn on data, and the more data, the more the machines learn. Take Siri, the talking, question-answering application in iPhones, which Apple introduced last fall. Its origins go back to a Pentagon research project that was then spun off as a Silicon Valley start-up. Apple bought Siri in 2010, and kept feeding it more data. Now, with people supplying millions of questions, Siri is becoming an increasingly adept personal assistant, offering reminders, weather reports, restaurant suggestions and answers to an expanding universe of questions.
Google searches, Facebook posts and Twitter messages, for example, make it possible to measure behavior and sentiment in fine detail and as it happens. In business, economics and other fields, decisions will increasingly be based on data and analysis rather than on experience and intuition.
Retailers, like Walmart and Kohl’s, analyze sales, pricing and economic, demographic and weather data to tailor product selections at particular stores and determine the timing of price markdowns. Shipping companies, like U.P.S., mine data on truck delivery times and traffic patterns to fine-tune routing. Police departments across the country, led by New York’s, use computerized mapping and analysis of variables like historical arrest patterns, paydays, sporting events, rainfall and holidays to try to predict likely crime “hot spots” and deploy officers there in advance. Research has found that companies adopting “data-driven decision making” achieved productivity gains that were 5 to 6 percent higher than other factors could explain.
(This is the second of series of posts on Big data and the Internet of Things. Read the first post here.)
With an 18-fold increase expected over the next five years, data is the new class of economic asset, like currency or gold. With a growing multiplicity of data sources, Big Data has the potential to be “humanity’s dashboard,” an intelligent tool that can help combat poverty, crime and pollution. Privacy advocates take a dim view, warning that Big Data is Big Brother, in corporate clothing.
What is Big Data? A meme and a marketing term, for sure, but also shorthand for advancing trends in technology that open the door to a new approach to understanding the world and making decisions. There is a lot more data, all the time, growing at 50 percent a year, or more than doubling every two years, estimates IDC. It’s not just more streams of data, but entirely new ones. For example, there are now countless digital sensors worldwide in industrial equipment, automobiles, electrical meters and shipping crates. They can measure and communicate location, movement, vibration, temperature, humidity, even chemical changes in the air.
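Both growth figures – 50 percent a year here, and the 18-fold rise over five years mentioned above – can be sanity-checked with simple compounding:

```python
# Sanity-checking the quoted growth figures with simple compounding.

# 50% annual growth compounds to more than 2x over two years,
# matching the 'more than doubling every two years' claim.
two_year_factor = 1.5 ** 2
print(f"50%/yr over 2 years -> {two_year_factor:.2f}x")

# An 18-fold rise over 5 years implies roughly 78% growth per year.
annual_rate = 18 ** (1 / 5) - 1
print(f"18x over 5 years -> {annual_rate:.0%} per year")
```

The two estimates come from different sources (IDC’s overall data growth versus the mobile-traffic forecast), which is why the implied annual rates differ.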
Linking these communicating sensors to computing intelligence gives rise to what is called the Internet of Things or the Industrial Internet. Improved access to information is also fueling the Big Data trend. For example, government data – employment figures and other information – has been steadily migrating onto the Web. In 2009, Washington opened the data doors further by starting Data.gov, a Web site that makes all kinds of government data accessible to the public.
Data is not only becoming more available but also more understandable to computers. Most of the Big Data surge is data in the wild – unruly stuff like words, images and video on the Web and those streams of sensor data. It is called unstructured data and is not typically grist for traditional databases. But the computer tools for gleaning knowledge and insights from the Internet era’s vast trove of unstructured data are fast gaining ground. At the forefront are the rapidly advancing techniques of artificial intelligence like natural-language processing, pattern recognition and machine learning.
2011 is perhaps Indian telecom’s biggest year yet, with the launch of 3G services, the implementation of mobile number portability and 175 million new mobile phone subscribers in 10 months, taking the total subscriber base to 881 million.
However, there have been challenges for Indian telecom businesses:
1. 93 percent of users are low-spending pre-paid users
2. With low ARPU and high energy costs for diesel backup at half a million towers, margins are a struggle.
3. 3G licenses have come at a very heavy cost, and the impact is cash-strapped operations for many telcos. The government made a lot of money and squandered a little more, but that is a different story.
4. In spite of huge investments in 3G, poor user experience and a lack of content failed to draw users, dashing operator hopes of recovering that money.
5. Mobile number portability: 25 million users applied to switch operators while retaining their numbers, with 2.5 million requests pouring in each month. The churn is taking its toll, and operators are responding with tariff cuts and deals.
A few future-defining trends also took shape in 2011 as markets evolved, matured and consolidated:
1. 2011 is possibly the year when the Indian telecom industry moved from an entry market to a replacement market. New subscriber additions plummeted to 6–7 million per month, against an average of 15–20 million activations in 2010.
2. Data emerges as the hero as telecom starts evolving beyond a somewhat voice-centric industry. 2011 should herald the decade of data for India, with preliminary 3G and EV-DO Rev. B launches. LTE is around the corner.
3. New classes of devices, such as entry-level smartphones and tablets with advanced OSs and application capabilities, widen consumer choice as well as the experience. Low-cost Android devices are rapidly driving smartphone adoption in the ~$80 price segment.
4. With tariffs bottoming out, Indian telcos are looking for the next wellspring of revenue and profits, and new revenue models will start to emerge. Operators are looking at various VAS-aided business models to augment their margins and profits.
5. The Aakash tablet (and NotionInk’s Adam before it) established India’s status as a low-cost innovator. Going forward, with the markets in SE Asia and Africa key to telecom growth, India will feature as a global innovation and R&D centre.
6. The government has announced the NTP (National Telecom Policy), a proactive step in defining how telecom sector businesses will operate going forward. The industry awaits greater clarity on a few issues, such as mergers and acquisitions, and we should see things get clearer as we go along.
The profusion of smartphones and other data-centric devices is pushing the limits of network traffic and bandwidth. Mobile networks have undergone fundamental changes in the evolution from 1G to 4G: network speeds, the number of users and the diversity of applications and services have skyrocketed. These changes are forcing operators to rethink their network-management strategies — not a minor tweak, but a major overhaul.
Comprehensive network management strategies and services now enable operators to avoid becoming dumb pipes. The first step is completely rethinking how to manage their networks.
Before mobile data became popular, operators focused on engineering their networks. Now, operators are shifting their focus from networks to traffic. With infrastructure that provides packet-level insight into that traffic, operators can identify different traffic types and apply a specific policy to each one. For example, operators can dynamically allocate bandwidth and load-balance links to improve latency and throughput. As a result, they can use network quality of service (QoS) as a powerful market differentiator. The key is to understand how customers use services and the network resources associated with that usage. Hence the differentiator is based on the network’s ability to gather broader, deeper, real-time information about user sessions.
This allows operators to engineer applications, including managing traffic and dynamically provisioning resources, to ensure all applications deliver the best possible performance.
Operators also can use the network’s awareness of user content and context to deliver services tailored to each subscriber’s usage patterns. For example, operators might create a service targeted at parents with family payment plans so they can monitor their children’s activities. Another service might cater to users who watch a lot of video on their devices by prioritizing video over other applications. These are a few examples of how application engineering enables operators to reduce costs, create additional revenue streams and improve the user experience.
Thus the concept of smart networks is based on the following four main aspects:
• Visibility. See exactly what applications customers use, where the network hot spots are and what’s causing those hot spots.
• Control. Prioritize traffic, set policies and block traffic, if necessary.
• Optimization. There are two aspects of optimization. The first is capacity and the efficient use of network resources. The second is optimizing the quality of experience for users.
• Monetization. Generate new revenue streams from the applications and services on the network.
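A toy policy engine can make the visibility and control aspects concrete. The traffic classes, port mappings, priorities and rate caps below are hypothetical illustrations, not any operator’s or vendor’s actual configuration:

```python
# Toy smart-network policy engine: classify each flow by application
# type (visibility), then serve flows in per-class priority order
# (control). Classes, ports and priorities are invented examples.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Policy:
    priority: int                 # lower number = scheduled first
    rate_cap_kbps: Optional[int]  # None = uncapped

# Control: per-application-class policy table.
POLICIES = {
    "voip":  Policy(priority=0, rate_cap_kbps=None),
    "video": Policy(priority=1, rate_cap_kbps=4000),
    "web":   Policy(priority=2, rate_cap_kbps=None),
    "bulk":  Policy(priority=3, rate_cap_kbps=1000),
}

def classify(dst_port: int) -> str:
    """Visibility: a crude packet-level classifier keyed on port."""
    return {5060: "voip", 554: "video", 443: "web"}.get(dst_port, "bulk")

def schedule(flows: list) -> list:
    """Serve (name, dst_port) flows in policy-priority order."""
    return [name for name, port in
            sorted(flows, key=lambda f: POLICIES[classify(f[1])].priority)]

flows = [("backup", 21), ("call", 5060), ("stream", 554), ("browse", 443)]
print(schedule(flows))  # voice first, bulk last
```

A real packet core does this with deep packet inspection and enforced rate limiting rather than port lookups, but the shape is the same: see the traffic, map it to a policy, act on the policy.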
Using intelligent/smart networks, operators can offload up to 70% of Internet traffic at the network edge. That offload increases core network efficiency, improves the user experience and reduces CapEx by up to 50%. This approach enables operators to use their resources more efficiently and apply packet-core platform/network-based intelligence to dynamically offload traffic.
Smart networks take operators out of the unsustainable dumb-pipe business and help them reduce expenses, create new revenue streams and strengthen profitability. Just as important, they make it possible to engineer and optimize the user experience. That translates into stronger customer loyalty.
Data is big… just how big is something that isn’t really expressed in numbers! Not at least till you have metaphors for data in the context of common understanding. In this post, I feature a presentation from Cisco’s Dave Evans. Apart from the data metaphors, it is amazing to see how our lives will change in the next 10 years due to technology advancements.
Cognitive radio technologies include the ability of devices to determine their location, sense spectrum use by neighboring devices, change frequency, adjust output power, and even alter transmission parameters and characteristics. A cognitive radio is a transceiver that is able to understand and react to its operating environment. Cognitive radio thus concerns devices and networks that are computationally intelligent about radio resources and related communications: they detect user communication needs as a function of use context and provide radio resources and wireless services appropriate to those needs. The radio is aware of changes in its environment and responds by adapting its operating characteristics to improve performance or minimize loss of performance.
• At one extreme is an intelligent device that can reconfigure itself to interact with any radio network in the vicinity, depending on the requirements of the user.
• At the other extreme is an intelligent device that detects interference and changes its operating frequency to avoid it.
Spectral occupancy measurements consistently show that some bands are underutilized in some areas at some times. Recent measurements by the FCC in the US show that 70% of the allocated spectrum is not utilized, with the time scale of spectrum occupancy varying from milliseconds to hours. Cognitive radio increases the utilization of the radio spectrum and reduces spectrum holes and white spaces. Spectrum holders are thus able to use their spectrum more efficiently and sub-license it, supporting new models not directly tied to spectrum availability. In short, it could facilitate spectrum trading.
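The sense-and-adapt loop can be sketched in a few lines: measure occupancy on each candidate channel and move to the least busy one. The occupancy readings below are simulated stand-ins for real RF energy detection, and the channel list is just an illustrative example:

```python
# Minimal sketch of cognitive-radio channel selection: sense the
# occupancy of each candidate channel, then hop to the least-used one.
# Occupancy is simulated; a real radio would measure RF energy.
import random

random.seed(7)
CHANNELS = [2412, 2437, 2462]  # MHz, e.g. Wi-Fi channels 1/6/11

def sense(channel: int) -> float:
    """Fraction of sensing slots in which the channel was busy (simulated)."""
    return random.random()

def pick_channel() -> int:
    occupancy = {ch: sense(ch) for ch in CHANNELS}
    best = min(occupancy, key=occupancy.get)
    print({ch: round(o, 2) for ch, o in occupancy.items()}, "->", best)
    return best

chosen = pick_channel()
```

Running this loop periodically is the essence of exploiting white spaces: the radio keeps vacating channels as their measured occupancy rises, instead of holding a fixed assignment.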
The potential benefits include the expansion of critical communication networks, higher data-rate services for users, enhanced coverage, more extensive device roaming and cost management. Cognitive radio is a promising technology that can significantly enhance utilization of radio spectrum and has the potential to facilitate new spectrum trading approaches and business models.
I have been writing about how a price war in the Indian telecom industry would be a fight in vain. I have also written about why telcos should explore a future in data and consumer-centric services rather than price wars. The pay-per-second bloodbath was clearly inevitable. However, I can’t think why the large telcos in the country would miss the trick. A price-led strategy would never be sustainable, and yet the whole industry seems to have rushed into the 1 paisa per second formula. Am I missing something?
The explosion of subscribers in India has put a lot of pressure on the existing telecom infrastructure and the frequencies available. While there have been efforts like infrastructure sharing between operators, which has helped keep capex under control, there are also minefields such as mobile number portability, which adds a lot of uncertainty around subscriber additions and churn. 3G is seen as an answer to the lack of bandwidth, but the license fees demanded by the government are exorbitant and will require long gestation periods. India has also attracted players like MTS, Telenor, Etisalat and DoCoMo, adding a lot of uncertainty to the existing market conditions.
The principal sources of operator revenue are voice and data. (Data services here also include SMS and MMS.) Under the present bandwidth shortages, existing operators have only been able to capitalize on voice-led growth. SMS is the only other significant contributor to revenues in the Indian telecom ecosystem. The current contribution of data services to operator revenues ranges from 8 to 11%. This includes SMS, and also the Tata Teleservices and Reliance CDMA connections, which are typically data-heavy services. GSM’s data revenues would be much lower than CDMA’s. India being an 80% GSM country, the ratio of data services to telecom services thus lags the international numbers. World over, a higher percentage of data revenue balances the fall in ARPU (data ARPUs are on the rise globally). In India, data services provide no such safety net.
A case in point is the US telecom market, the world’s highest consumer of telecom-based data solutions. Over the last five years, data ARPU there has increased 7x while voice ARPU has fallen by 30%. A $15 ARPU loss in voice has been compensated by a $12 increase in data ARPU. One might argue that this is the case because the US is a 3G country, but the Indian context, though different in regulatory and ecosystem aspects, can still draw from this example.
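The quoted US figures are mutually consistent, as a quick calculation shows. The $2 starting data ARPU below is an inferred illustration, chosen so that a 7x rise yields the quoted $12 gain; it is not a sourced number:

```python
# Working through the US figures quoted above. The $2 base data ARPU
# is inferred from the numbers, not sourced: a 7x rise adding $12
# implies 7*x - x = 12, i.e. x = $2.

data_arpu_then = 2.0
data_arpu_now = data_arpu_then * 7
voice_loss = 15.0

data_gain = data_arpu_now - data_arpu_then   # the quoted $12
net_change = data_gain - voice_loss          # data absorbs most of the fall
print(f"Data ARPU gain: ${data_gain:.0f}, net ARPU change: ${net_change:.0f}")
```

The net change of only a few dollars is the safety net the post refers to: data growth absorbing most of the voice decline.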
- While voice tariffs in India are the lowest in the world, data tariffs in India are amongst the highest. Cost being an important determinant of penetration, higher data costs have acted as barriers to the spread of data.
- Application and content revenue-sharing models sometimes make it difficult for higher levels of applications to be built, because of cost and longer break-even periods. Even when applications are made, the revenue sharing with telcos in India makes it difficult for the apps provider to advertise or communicate the offering to consumers.
- There is little in terms of consumer services for high-ARPU consumers. Telcos in India could perhaps learn a few lessons from Indian banks in the differential treatment of HNIs.
- All this time, telcos in India have done little to tie up with content providers such as the Googles of the world. LBS, maps, navigation and social networking could have been great apps. This is a “blue ocean” where telcos have not yet ventured at all.
- The CEO of one of the biggest telcos in India once dismissed MVNOs as “loss making”. Perhaps it is time to re-think strategy in terms of branded and exclusive content. (Read Report)
- All this while, Nokia has been preparing its platforms to differentiate itself through services. Telcos were in a far better position to aggregate service bundles, and yet they did not. Did all of them miss the trick? Did they fall into the operator dumb-pipe syndrome trap!
So, when it was sunny, all the telcos in India did was rack up good subscriber numbers at falling ARPUs. That was the low-hanging fruit. Nobody looked at the next levels, perhaps because they were rolling in money anyway. The bloodbath of per-second tariffs is now catching up with the telcos. They still prefer to look at market shares rather than EVAs and bottom lines.
The next story in the making would be the inevitable shakeout and an age of acquisitions.
Please feel free to rate the post and put your comments (negative or positive as may be!)
Read the earlier posts here: