Ronnie05's Blog

Integrating the elements of convergence: The case for APIs

Posted in Internet and Search, Mobile Data & Traffic, The cloud and the open source by Manas Ganguly on February 28, 2012

Convergence has been the buzzword for a good part of the last decade and will remain so in this one. However, for the discerning, the definition, or at least the meaning, of convergence has now shifted from device convergence to technology convergence, the latter being the superset of which devices are just another manifestation. So earlier it was the camera, the mobile phone, the GPS, the MP3 player and other such device characteristics that converged. In the present context, however, it is enabling technologies that are converging, and the three big technologies that appear convergent at this time are Mobility, Cloud Services and Big Data.

However, it is a relatively small lynchpin that drives the convergence of these three mega-trends: small in terms of what it is, but large in terms of the innovation spurts it provides. The key here is APIs, or Application Programming Interfaces. APIs tie together the mega-trends in a fundamental and unalterable way. They are the lingua franca of the new wave: the internet of all things, combined with super mobility and seamless connectivity. In my mind, each of these three technology trends on its own will be on the fast track to commoditization and will risk the same fate as most social business software plays. The magic and the premiums will come from contextual application of this innovation and smart integration.

To take a few examples, Box.net as storage without document and device sync and collaboration is a commodity. Apple’s iCloud as storage without ubiquitous local and iTunes media sync across devices is a commodity. And Google Drive (as discussed here in Ben Kepes’ CloudU community) would also be a commodity business not worth getting into, had it not been for Google’s services such as Google Apps, Picasa, and its media and unified communication capabilities under the Google Plus brand.

The premiums from big data, mobile access and cloud come from
a) dynamically assembled media and content, and interpreted data in the cloud,
b) available wherever you need to consume and / or collaborate and
c) insanely focused and simple interfaces to complex backends.

That’s where money will be made in these commoditized services. APIs provide the integration across the value-creation network. The only other differentiator in this case is experience!
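To make the point concrete, here is a minimal sketch, with entirely hypothetical function names standing in for real cloud-storage, mobile-context and big-data APIs, of how a thin API layer can assemble contextual content from commodity backends:

```python
# Illustrative sketch only: each function is a hypothetical stand-in
# for a real provider's API, not an actual service call.
def cloud_fetch(user_id):
    """Stand-in for a cloud-storage API: documents synced for this user."""
    return {"docs": ["q3_report", "roadmap"], "user": user_id}

def mobile_context(user_id):
    """Stand-in for a device/location API."""
    return {"device": "phone", "location": "airport"}

def analytics_rank(docs, context):
    """Stand-in for a big-data API: rank content for the current context."""
    # Toy "contextual" rule: travelling users see shorter documents first.
    return sorted(docs, key=len) if context["location"] == "airport" else docs

def assemble_feed(user_id):
    """The thin integration layer: three commodity APIs, one simple interface."""
    data = cloud_fetch(user_id)
    ctx = mobile_context(user_id)
    return analytics_rank(data["docs"], ctx)

print(assemble_feed("u42"))  # → ['roadmap', 'q3_report']
```

The value sits in `assemble_feed`, the contextual integration, not in any one backend, which is exactly the argument above.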


Indian Telecom: The next round of spectrum auction would have to be telecom’s path into the future

Posted in Industry updates by Manas Ganguly on February 23, 2012

Telecom, especially wireless, was supposed to be India’s ticket to economic development. Most operators contend that the government is trying to extract too much from them. The top four firms, for instance, must have paid close to $15 billion over the last four years alone as revenue share, licence fees, service tax and spectrum charges. Industry watchers contend that if the government tries to extract more, it will only be counter-productive.

One of the biggest obstacles operators face in breaking even is the cost of spectrum (given rock-bottom tariffs), especially when acquired through auctions. In India, spectrum is expensive, scarce and unpredictable. That needs to change for operators to be able to innovate around long-term investment plans.

Auctions must encourage competition between not just players but also technologies, say 2G vs. 3G or GSM vs. CDMA. Technology neutrality is absolutely critical because, just as you don’t want a player to have an undue advantage, you should also make sure a technology doesn’t have one. When regulators pick a technology before the auctions, they’re essentially picking winners.

Doing this allows savvy operators to marry the most efficient technology to the spectrum they purchase. Naturally, they’ll pick one that’ll have maximum consumer value.

A unified licence should mean an operator is free to offer whatever combination of technologies it wants, even if that means a 2G plus LTE data card. The unified licence automatically takes care of revenue leakage.

There is also a case for offering slightly preferential treatment to those operators whose licences got scrapped and are re-bidding, versus those that are completely new. Globally, it’s a well-accepted strategy among regulators to reserve blocks of spectrum for new entrants and provide them certain handicaps. But that is now a political question in India. Incumbents believe that subjectivity and artificial restriction on the number of bidders will only lead to market distortion.

The long-term goal for the regulator must be to figure out how to bring more spectrum to auction, more frequently. What is considered reasonable spectrum for an operator in most markets, like the US and Russia, is 15-20 MHz. That should actually be higher in India, given the size of our market. On the contrary, Indian operators have to contend with four times the number of consumers on networks working with a fourth of the spectrum.

Unfortunately, we have too much spectrum divided in narrow slices across too many operators, in itself leading to greater wastage. Every time spectrum is divided between two operators, guard bands are reserved around them to prevent accidental interference. The more operators, the more guard bands. About 15 percent of our spectrum is currently lost to such guard bands, and in some cases I’ve even heard of 20 percent.
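The waste is easy to quantify with a back-of-the-envelope sketch; the band size and guard width below are illustrative assumptions, not official figures:

```python
# Sketch: spectrum lost to guard bands when one band is split among operators.
# Model assumption: each boundary between adjacent allocations needs one guard band.
def guard_band_waste(total_mhz, operators, guard_mhz):
    """Fraction of the band consumed by guard bands between adjacent allocations."""
    boundaries = operators - 1          # internal boundaries inside the band
    wasted = boundaries * guard_mhz
    return wasted / total_mhz

# With an assumed 75 MHz band split across 12 operators and ~1 MHz guards,
# roughly 15% of the spectrum is lost, matching the figure quoted above:
print(round(guard_band_waste(75, 12, 1.0), 2))  # → 0.15
```

Halving the number of operators in this toy model roughly halves the guard-band loss, which is the consolidation argument in numbers.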

After issues around the migration of Defence spectrum are sorted out, next up should be a comprehensive spectrum plan that includes more of the lower bands, which offer greater propagation at lower cost. Unnecessary secrecy and lack of transparency are three-fourths of the problem in telecom. If you reveal all relevant information in an auction, then players can decide for themselves.

Indian Telecom’s New Deal too will need the “Three Rs”: Relief for serious operators and poorly served consumers, Recovery of the sector’s vast potential, and Reform to bring lasting transparency and fairness. If we don’t get it right this time, the next 10 years of Indian telecom will be like the last 10.

This crisis really, never mind the cliché, is an opportunity.

Re-thinking Indian Telecom

Posted in Industry updates by Manas Ganguly on February 22, 2012

Continued from an earlier post: Indian Telecom: Under siege

Focus in Indian telecom now needs to shift from infrastructure creation and network rollout towards reuse of resources and innovation. The concept of tower sharing that began four to five years ago was a great milestone: it showed operators realising there is more value in sharing assets than in keeping them to oneself. Therefore, instead of penalising operators for sharing scarce resources, as the drive against “illegal 3G roaming” indicates, the regulator should encourage all forms of asset sharing, including spectrum, provided consumers aren’t being short-changed.

The legislators also need an MVNO policy to allow MVNOs (virtual operators who lease infrastructure from other operators) into the market ecosystem. This enables idle assets to become economically productive platforms for consumer innovation. Once this freedom is established, the wholesale market will naturally begin to evolve. Players like LightSquared, a company building a $7 billion wholesale-only 4G network across the USA, could make eminent sense in broadband-hungry India.

Wholesale should be very seriously encouraged. In fact, for BSNL it could be a brilliant option, given its huge strength in infrastructure but almost nothing on the retail side.

The government needs to encourage mergers and acquisitions by doing away with all restrictions on spectrum held and by increasing the permitted combined market share in consultation with the Competition Commission of India (CCI). Playing the regulator by imposing artificial constraints on mergers prevents a free play of market forces. When it comes to mergers, fears of cartelisation or monopolisation are almost always overblown.

The concept of regulators determining the number of operators is itself outdated. Regulating the telecom sector requires expertise that a government ministry cannot be expected to have. Today we have technological and economic solutions, like spectrum and network sharing, which should dictate sectoral dynamics. In fact, the concept of an operator as an end-to-end entity needs to be rethought; it can be just a customer-facing entity. Regulators should become innovative and forward-looking, and behave like economists. Unfortunately, many don’t comprehend their basic tasks and responsibilities. Regulators today are still stuck in the pre-1991, Hindu-rate-of-growth, babudom-dominated, authoritarian decision-making culture. The market and the economy have evolved manifold and should be treated as a play of multiple market forces, where the role of the regulator is to set a level playing field and safeguard customer interest against cartelisation.

Big Data: Controlling the beast by its horns

Posted in Mobile Data & Traffic by Manas Ganguly on February 20, 2012

(This is the fourth in a series of posts on Big Data and the Internet of Things. Read the first, second and third posts here.)

“I look for hot spots in the data, an outbreak of activity that I need to understand. It’s something you can only do with Big Data.” – Jon Kleinberg, a professor at Cornell

Researchers have found a spike in Google search requests for terms like “flu symptoms” and “flu treatments” a couple of weeks before there is an increase in flu patients coming to hospital emergency rooms in a region (and emergency room reports usually lag behind visits by two weeks or so).

Global Pulse, a new initiative by the United Nations, wants to leverage Big Data for global development. The group will conduct so-called sentiment analysis of messages in social networks and text messages – using natural-language deciphering software – to help predict job losses, spending reductions or disease outbreaks in a given region. The goal is to use digital early-warning signals to guide assistance programs in advance to, for example, prevent a region from slipping back into poverty.

In economic forecasting, research has shown that trends in increasing or decreasing volumes of housing-related search queries in Google are a more accurate predictor of house sales in the next quarter than the forecasts of real estate economists.
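A toy sketch of why such leading indicators work, using made-up weekly numbers rather than real query data: shift the query series by its lead time and the correlation with outcomes jumps.

```python
# Sketch with synthetic data: lagged search-query volume as an early-warning
# signal, mirroring the two-week flu-query lead described above.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

queries = [10, 12, 30, 80, 90, 60, 30, 15]  # weekly "flu symptoms" searches
visits  = [ 5,  6,  7, 11, 29, 78, 92, 61]  # ER visits, trailing ~2 weeks

lag = 2
print(round(pearson(queries, visits), 2))              # weak if unshifted
print(round(pearson(queries[:-lag], visits[lag:]), 2)) # strong once lagged
```

The signal is only visible once the series are aligned on the lead time, which is why naive same-week comparisons understate the predictive value.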

Big Data has its perils, to be sure. With huge data sets and fine-grained measurement, statisticians and computer scientists note, there is an increased risk of “false discoveries.” The trouble with seeking a meaningful needle in massive haystacks of data is that “many bits of straw look like needles.”

Data is tamed and understood using computer and mathematical models. These models, like metaphors in literature, are explanatory simplifications. They are useful for understanding, but they have their limits. A model might spot a correlation and draw a statistical inference that is unfair or discriminatory, based on online searches, affecting the products, bank loans and health insurance a person is offered, privacy advocates warn.
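The “straw that looks like needles” problem is easy to demonstrate with synthetic data: scan enough series of pure noise against a random target and some will correlate strongly by chance alone.

```python
# Sketch: false discoveries from scanning many random "signals".
# Everything here is synthetic noise; any strong correlation is spurious.
import random

random.seed(0)

def corr(xs, ys):
    """Pearson correlation of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

target = [random.random() for _ in range(12)]
# 2,000 unrelated "signals": short series of pure noise.
signals = [[random.random() for _ in range(12)] for _ in range(2000)]

best = max(abs(corr(s, target)) for s in signals)
print(round(best, 2))  # the best noise series looks like a strong "discovery"
```

With thousands of candidate series and only a dozen observations each, a correlation that would be impressive in isolation turns up reliably in noise, which is why fine-grained Big Data demands multiple-testing discipline.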

Despite the caveats, there seems to be no turning back. Data is in the driver’s seat. It’s there, it’s useful and it’s valuable, even hip. It’s a revolution. We’re really just getting under way. But the march of quantification, made possible by enormous new sources of data, will sweep through academia, business and government. There is no area that is going to be untouched.

Smartphones and Tablets to shape personal computing by 2015

Posted in Uncategorized by Manas Ganguly on February 18, 2012

Global Tablet sales to end users reached 67.0 million units in 2011 and are expected to reach 248.6 million units by the end of 2015, growing at a CAGR of 38.8% from 2011 to 2015. Asia-Pacific (including Japan) is expected to enjoy the highest share of overall global shipments and end-user sales of Tablets, at 36.1% and 35.3% respectively, in 2015.
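The quoted growth rate follows from the standard CAGR formula; a quick check of the tablet figures:

```python
# Verifying the CAGR quoted above: 67.0M units in 2011 to 248.6M in 2015.
def cagr(start, end, years):
    """Compound annual growth rate over the given number of years."""
    return (end / start) ** (1 / years) - 1

print(round(cagr(67.0, 248.6, 4) * 100, 1))  # → 38.8
```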

In 2011, Smartphone sales to end users reached 469.9 million units, registering a growth of 66.7% over 2010 sales of 282.0 million units. Smartphone sales to end users are expected to reach 1,048.0 million units by 2015, with Asia-Pacific accounting for the largest market share at 39.5%. Asia-Pacific is also expected to enjoy the highest growth rate, at a CAGR of 36.3% from 2010 to 2015.

Tablet sales to end users increased by 276.5% in 2011, from 17.8 million units sold in 2010. The growth is largely attributed to consumer response to Apple’s iPad 2, Samsung’s Galaxy Tab and Galaxy Tab 2, and Amazon’s Kindle Fire. Launched in November 2011, Amazon’s USD 199 tablet, the Kindle Fire, drew an incredible response among consumers and managed around 3.5 million unit sales in the last 45 days of 2011. Globally, the installed base of Tablet devices reached 81.2 million units in 2011 and is expected to reach 388.8 million units by the end of 2015.

The consumer segment is the largest adopter of media Tablet devices, while business users prefer communicators. Media Tablets are expected to remain the largest Tablet device segment, with over 60% sales share in 2015, while the hybrid segment will account for more than one-fourth of sales in the same year. Smartphones are becoming ubiquitous communication devices among all user segments, with almost 75% of consumer (individual) smartphone subscribers using their smartphones for personal as well as business purposes. Moreover, 65% of global SMBs now allow employee-owned smartphones for official use. This acted as a strong booster for Smartphone market growth. The Smartphone market grew by 66.7% during the last year, with sales reaching 469.9 million units in 2011. Smartphone sales in Q4 2011 alone crossed the combined sales of all four quarters of 2008. This leap came on account of consumer as well as enterprise adoption of the iPhone 4S, which posted 36.1 million unit sales to end users in Q4 2011 alone.

Courtesy: Transparency Market Research


An alternative model in Spectrum Licensing

Posted in Uncategorized by Manas Ganguly on February 18, 2012

Think of the spectrum like the village commons

In a shock ruling ten days ago, the Supreme Court cancelled 122 mobile phone licences that had been deceitfully awarded in 2008. The ruling sent the telecom industry into chaos, confirmed dreadful corruption in the government’s decision-making process, and damaged the reputation of our nation in the eyes of the world—especially foreign investors. There was much euphoria inside the country, however, for justice had seemingly been served.

The Supreme Court also instructed the telecom regulator (TRAI) to auction the illegally gained 2G spectrum, as was done in the case of 3G spectrum. “While transferring or alienating natural resources, the state is duty bound to adopt the method of auction by giving wide publicity so that all eligible persons can participate,” said the court judgment. Auctions are certainly a better way to allocate a scarce resource than first come, first served, but what former telecom minister A Raja did was, of course, preposterous: he subverted the “first come, first served” policy by changing rules midway; he allocated spectrum out of turn in a non-transparent manner, and that too at 2001 prices, thus creating the biggest corruption scandal in India’s history.

But is auctioning the best way to allocate radio spectrum? Although it is scarce, should it be used as a money-making device by government? Since water is scarce, should it be auctioned? No. The risk in an auction is that the “animal spirits” of entrepreneurs force them to bid very high, which is then reflected in high tariffs, and this forces the poor out of the market. Thirty-one countries have used spectrum auctions and many have regretted it for this reason. India is, perhaps, the world’s most successful telecom market, with the lowest tariffs in the world. Hence, it has the highest number of subscribers who are poor. The credit for this goes to the previous government, which had the courage to change policy from high licence fees to revenue sharing between the telephone company and the government. If the state had been “duty bound to hold an auction”, cell phones would not have reached the poor.

In an ideal world, radio spectrum would be like sunshine, which is not owned by anyone or any government but is enjoyed by everyone without cost. But unlike sunshine, spectrum is finite, and hence it has historically been controlled by governments. It is widely accepted that government allocation is inefficient and leads to corruption. Ronald Coase, the Nobel Prize winner, long ago exploded the myth that governments should control spectrum to prevent airwave chaos. Today many experts think of spectrum as a common grazing ground around a village, open to everyone to use freely. They claim that new spectrum-sharing technologies allow a virtually unlimited number of persons to use it without causing each other interference; this eliminates the need for either property rights or government control. This is why the United States has gone ahead and designated a 50 MHz block of spectrum in the 3650 MHz band as a “commons”.

If the spectrum were a “commons”, nobody would own it or need to auction it. A telecom company would merely register with an authority, which would assign it a spectrum frequency for its use. When the company reached the limit of its spectrum, the authority would release more to it. Just as a villager pays a nominal tax for maintaining the commons, depending on how many cattle he grazes, each cell phone subscriber would pay a nominal fee, say a paisa per minute, towards upkeep of the spectrum. It would form part of the monthly bill, and be transferred by the phone company to the authority. Just as a village needs rules to prevent over-grazing, there would be rules for maintaining the spectrum to avoid a “tragedy of the commons”. The rules would be transparent, monitored in real time, and no one would be able to corner the spectrum.

Unfortunately, the Supreme Court judgment has come out so heavily prescriptive in favour of auctions that future governments in India will be shy to adopt a better alternative. Technology is developing very rapidly and soon the world will be ready for an “open spectrum” regime, but the court’s inflexible judgment will inhibit the Indian government in doing the right thing. A pity!

Reproduced from an article by Gurcharan Das in ToI

Gartner: Q4, 2011 Mobile Phone and Smartphone Market shares

Posted in Industry updates by Manas Ganguly on February 17, 2012

Worldwide smartphone sales to end users soared to 149 million units in the fourth quarter of 2011, a 47.3 percent increase from the fourth quarter of 2010. Total smartphone sales in 2011 reached 472 million units and accounted for 31 percent of all mobile device sales, up 58 percent from 2010.

Apple became the third-largest mobile phone vendor in the world, overtaking LG, and the world’s top smartphone vendor, with a market share of 23.8 percent in the fourth quarter of 2011; it was also the top smartphone vendor for 2011 as a whole, with a 19 percent market share.

LG, Sony Ericsson, Motorola and Research In Motion (RIM) again recorded disappointing results as they struggled to improve volumes and profits significantly. These vendors were also exposed to a much stronger threat from the midrange and low end of the smartphone market as ZTE and Huawei continued to gain share during the quarter. 

Worldwide mobile device sales to end users totaled 476.5 million units in the fourth quarter of 2011, a 5.4 percent increase from the same period in 2010. In 2011 as a whole, end users bought 1.8 billion units, an 11.1 percent increase from 2010.

Indian Telecom: Under siege!

Posted in Industry updates by Manas Ganguly on February 17, 2012

Bad governance and regulation in India is an equal-opportunity offender. The UPA’s errors of omission and alleged commission since 2004 have all but ruined one of the best stories of the Indian economy: telecom.

While new entrants struggle for their very right to exist in India, older incumbents are staring at the prospect of being “retroactively” charged an estimated Rs. 37,000 crore each for 10 years’ worth of “excess spectrum” held by them. State-owned telcos are being progressively run into the ground, and earnest private operators have been waiting for start-up spectrum in key markets for years. When a foolhardy new entrant popped up and bid $1 billion for 4G licences in four lucrative markets, its application was rejected for thoroughly frivolous reasons.

A mish-mash of secretive and uninformed regulators – the Ministry of Telecom, TRAI, the Department of Telecom, TDSAT – work at cross-purposes with one another. Operators and other stakeholders come to know of regulations via frequent “leaks” to the media, in most cases traced back to unauthorised photocopies of official documents palmed off and sold by junior staffers. The phrase “regulation through photocopy” would not be out of place, most stakeholders agree in private.

With nearly 900 million subscribers across seven to 15 operators (depending on whether you include the ones whose licenses were cancelled), regulators ought to understand Indian Telecom is no longer in its infancy. The whole idea of prescribing how an eighth, ninth or tenth operator should roll out their networks doesn’t make sense today. Consumers need service competition, not multiplicity of infrastructure. There is a strong belief that this logic of introducing many more operators in each regional market may not be consistent with how wireless telecom has evolved across the world. According to the credit rating agency CARE’s research, on an average there were 15 players (both GSM and CDMA) in a telecom circle in India as compared to three in Singapore, three in China, four in Mexico and five in the US.

Out of the total 122 cancelled licences, 39 licence areas (about 32 percent) were under-utilised: the operators concerned did not even have 1,000 subscribers in many circles as of December 2011, implying the services were never fully rolled out. And here’s a telling comment on the state of the industry: after the cancellation of licences, the average number of operators will come down to around nine or ten.

This indicates that further entry of operators beyond four or five does not significantly increase the competitiveness of the market. And what do the fringe players have to show for their efforts? A $2 billion pan-India network without any subscribers!

(to be continued)


Channelizing and Structuring Big Data: Data First Thinking

Posted in Mobile Data & Traffic by Manas Ganguly on February 16, 2012

(This is the third in a series of posts on Big Data and the Internet of Things. Read the first and second posts here.)

There is plenty of anecdotal evidence of the payoff from data-first thinking. The best-known is still “Moneyball,” the 2003 book by Michael Lewis, chronicling how the low-budget Oakland A’s massaged data and arcane baseball statistics to spot undervalued players. Heavy data analysis had become standard not only in baseball but also in other sports, including English soccer, well before last year’s movie version of “Moneyball,” starring Brad Pitt.

Artificial-intelligence technologies can be applied in many fields. For example, Google’s search and ad business, and its experimental robot cars, which have navigated thousands of miles of California roads, both use a bundle of artificial-intelligence tricks. Both are daunting Big Data challenges, parsing vast quantities of data and making decisions instantaneously.

The wealth of new data, in turn, accelerates advances in computing – a virtuous circle of Big Data. Machine-learning algorithms, for example, learn on data, and the more data, the more the machines learn. Take Siri, the talking, question-answering application in iPhones, which Apple introduced last fall. Its origins go back to a Pentagon research project that was then spun off as a Silicon Valley start-up. Apple bought Siri in 2010, and kept feeding it more data. Now, with people supplying millions of questions, Siri is becoming an increasingly adept personal assistant, offering reminders, weather reports, restaurant suggestions and answers to an expanding universe of questions.

Google searches, Facebook posts and Twitter messages, for example, make it possible to measure behavior and sentiment in fine detail and as it happens. In business, economics and other fields, decisions will increasingly be based on data and analysis rather than on experience and intuition.

Retailers like Walmart and Kohl’s analyze sales, pricing and economic, demographic and weather data to tailor product selections at particular stores and determine the timing of price markdowns. Shipping companies like U.P.S. mine data on truck delivery times and traffic patterns to fine-tune routing. Police departments across the country, led by New York’s, use computerized mapping and analysis of variables like historical arrest patterns, paydays, sporting events, rainfall and holidays to try to predict likely crime “hot spots” and deploy officers there in advance. Research has found that “data-driven decision making” achieved productivity gains that were 5 percent to 6 percent higher than other factors could explain.

Big Data and the Internet of Things.

Posted in Mobile Data & Traffic by Manas Ganguly on February 15, 2012

(This is the second of series of posts on Big data and the Internet of Things. Read the first post here.)

With an 18-fold increase expected over the next five years, data is the new class of economic asset, like currency or gold. With a growing multiplicity of data sources, Big Data has the potential to be “humanity’s dashboard,” an intelligent tool that can help combat poverty, crime and pollution. Privacy advocates take a dim view, warning that Big Data is Big Brother in corporate clothing.

What is Big Data? A meme and a marketing term, for sure, but also shorthand for advancing trends in technology that open the door to a new approach to understanding the world and making decisions. There is a lot more data, all the time, growing at 50 percent a year, or more than doubling every two years, estimates IDC. It’s not just more streams of data, but entirely new ones. For example, there are now countless digital sensors worldwide in industrial equipment, automobiles, electrical meters and shipping crates. They can measure and communicate location, movement, vibration, temperature, humidity, even chemical changes in the air.
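The arithmetic behind these growth figures is straightforward:

```python
# 50% annual growth compounds to more than doubling every two years:
print(1.5 ** 2)  # → 2.25

# The 18-fold rise over five years cited above implies an even steeper
# annual rate than 50%:
implied = 18 ** (1 / 5) - 1
print(round(implied * 100))  # → 78 (percent per year)
```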

Linking these communicating sensors to computing intelligence gives rise to what is called the Internet of Things, or the Industrial Internet. Improved access to information is also fueling the Big Data trend. For example, government data – employment figures and other information – has been steadily migrating onto the Web. In 2009, Washington opened the data doors further by starting Data.gov, a Web site that makes all kinds of government data accessible to the public.

Data is not only becoming more available but also more understandable to computers. Most of the Big Data surge is data in the wild – unruly stuff like words, images and video on the Web and those streams of sensor data. It is called unstructured data and is not typically grist for traditional databases. But the computer tools for gleaning knowledge and insights from the Internet era’s vast trove of unstructured data are fast gaining ground. At the forefront are the rapidly advancing techniques of artificial intelligence like natural-language processing, pattern recognition and machine learning.
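As a toy illustration, with hypothetical word lists, nothing like a production NLP system, even a crude lexicon score turns unstructured text into a usable signal of the kind Global Pulse describes:

```python
# Toy sketch of extracting structure from unstructured text: a crude
# lexicon-based sentiment score. Word lists are illustrative assumptions.
POSITIVE = {"good", "great", "love", "happy", "hired"}
NEGATIVE = {"bad", "awful", "hate", "sad", "fired", "laid"}

def sentiment(message):
    """Positive-word count minus negative-word count; sign gives the tone."""
    words = message.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "great news, got hired today",
    "sad week, half the office was laid off",
]
print([sentiment(p) for p in posts])  # → [2, -2]
```

Real systems replace the word lists with trained language models, but the pipeline, raw text in, structured signal out, is the same.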
