Wednesday, December 24, 2008

Nokia Is The New BlackBerry Of The Emerging Countries

Nokia announced a mobile email service, Mail on Ovi, currently targeted at the emerging markets.
Nokia has had great success selling reliable and inexpensive handsets in the emerging markets. In countries such as India, consumers never used voicemail on their landlines and came through the mobile revolution using SMS as their primary asynchronous communication medium. Many of these users are not active email users, at least not on their mobile devices. If Nokia manages to provide a ubiquitous user experience, using Ovi to bridge email and SMS over not-so-advanced data networks, it could cause disruption by satisfying the asynchronous communication needs of hundreds of thousands of users.

Smartphones would certainly benefit from this offering and give BlackBerry a good run for its money. Nokia has completed the Symbian acquisition, making it a company whose OS powers 50% of all smartphones in the world. Symbian is still a powerful operating system, powering more than 200 million phones; it is open source and it is supported by Nokia. The emerging countries haven't yet gone through the data revolution, and Nokia is in a great position to innovate.

Friday, December 19, 2008

De-coupled Cloud Runtime And Demand-based Pricing Suggest Second Wave Of Cloud Computing

A couple of days back, Zoho announced that applications created using Zoho Creator can now be deployed on Google's cloud. On the same day, Google announced a tentative pricing scheme for buying resources on its cloud beyond the free daily quota. We seem to have entered the second wave of cloud computing.

Many on-demand application vendors who rely on non-cloud infrastructure have struggled to be profitable because the infrastructure cost is far too high. These vendors have value-based pricing for their SaaS portfolios and cannot pass the high infrastructure cost on to their customers. The first wave of cloud computing provided a nice utility model for customers who wanted to SaaS up their applications without investing in infrastructure, charging their own customers a fixed subscription. As I observe the second wave of cloud computing, a couple of patterns have emerged.

Moving to the cloud, one piece at a time: Vendors have started moving the runtime to a third-party cloud while keeping the design time on their own cloud. Zoho Creator is a good example: you can create applications using Zoho's infrastructure and then optionally use Google's cloud to run and scale them. Some vendors, such as Coghead, are already ahead in this game by keeping both design time and run time on Amazon's cloud. Many design tools that have traditionally been on-premise might stay that way while helping end users run part of their code on the cloud or deploy the entire application there. Mathematica announced an integration with Amazon's cloud where you can design a problem on-premise and send it to the cloud to compute. Nick Carr calls it the cloud as a feature.

Innovate with demand-based pricing: As the cloud vendors become more creative about how their infrastructure is utilized and introduce demand-based pricing, customers can innovate around their consumption. Demand-based pricing for the cloud could allow customers to schedule the non-real-time tasks of their applications for when computing is cheap. This approach would also make data centers greener, since energy demand would directly track computing demand, which is in turn managed by creative pricing. This is not new for the green advocates, who have long pushed for a policy change to promote a variable-pricing model for utilities that would base the price of electricity on demand rather than a flat rate. Consumers can benefit by having their appliances and smart meters negotiate with the smart grid for the best pricing; utilities can benefit by predicting demand better and making generation more efficient and green. I see synergies between the cloud and green IT.
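To make the idea concrete, here is a minimal Python sketch of demand-based scheduling. The price schedule, threshold, and job list are entirely hypothetical (no real cloud vendor's API is implied): real-time work runs immediately, while non-real-time jobs get deferred to the cheapest hour.

```python
# Sketch: defer non-real-time jobs until the (hypothetical) hourly
# price of computing drops. All numbers here are made up.

hourly_price = {  # hypothetical $/CPU-hour, keyed by starting hour
    0: 0.04, 4: 0.03, 8: 0.09, 12: 0.12, 16: 0.10, 20: 0.06,
}

def price_at(hour):
    """Return the hypothetical price in effect at a given hour (0-23)."""
    applicable = max(h for h in hourly_price if h <= hour)
    return hourly_price[applicable]

def schedule(jobs, threshold=0.05):
    """Run real-time jobs now; queue the rest for the cheapest hour."""
    deferred = []
    for job in jobs:
        if job["real_time"] or price_at(job["hour"]) <= threshold:
            print("run now:", job["name"])
        else:
            cheapest_hour = min(range(24), key=price_at)
            deferred.append((cheapest_hour, job["name"]))
    return deferred  # (hour, job) pairs to run when computing is cheap

jobs = [
    {"name": "serve-user-request", "real_time": True,  "hour": 13},
    {"name": "nightly-report",     "real_time": False, "hour": 13},
]
print(schedule(jobs))  # the nightly report waits for the cheapest hour
```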

Thursday, December 4, 2008

Incomplete Framework Of Some Different Approaches To Making Stuff

Steve Portigal sent me an article that he wrote in Interactions magazine, asking for my feedback. Unfortunately the magazine is behind a walled garden and requires a subscription, but if you reach out to Steve he should be able to share the article with you. In the absence of the original article, I will take the liberty of summarizing it. Steve describes how companies generally go about making stuff in his "incomplete" framework:

  • Be a Genius and Get It Right: One-person show to get it right such as a vacuum cleaner by James Dyson.
  • Be a Genius and Get It Wrong: One-person show to get it wrong such as Dean Kamen’s Segway.
  • Don’t Ask Customers If This Is What They Want: NBA changing the basketball design from leather to synthetic microfiber without asking the players.
  • Do Whatever Any Customer Asks: Implementing the changes as requested by the customers exactly as is without understanding the real needs.
  • Understand Needs and Design to Them: Discovery of the fact that women shovel more than men and designing a snow shovel for women.
I fully agree with Steve on his framework, and since it is proposed as an incomplete framework, let me add a few things of my own:

Know who your real customer is:

For enterprise software, the customer who writes the check does not necessarily use the software, and most of the time the real end users who use the software have no say in the purchase or adoption process. Designing for such demographics is quite challenging, since the customers' needs are very different from the users' needs. For instance, the CIO may want privacy, security, and control, whereas the end users may want flexibility and autonomy; designing software that is autonomous yet controlled, and secure yet flexible, is quite a challenge. As a designer, pleasing the CIO with his or her lower-TCO goals while delighting end users gets tricky.

Designing for children as end users and parents as customers also has similar challenges.

Look beyond the problem space and preserve ambiguity:

Hypothesis-driven user research alone will not uncover the real insights. Many times good design emerges from looking beyond your problem space.

If Apple had asked people what they wanted in their phones, they might have said they wanted a smartphone with a better stylus; they did not expect their phone to tell them where to eat dinner tonight. We wouldn't have had a multimodal interface on the iPhone that could run Urbanspoon.

Embracing and preserving ambiguity for as long as you can during the design process helps unearth some of the behaviors that lead to great design. Ambiguity does make people uncomfortable, but recognizing the fact that "making stuff" is fundamentally a generative process allows people to diverge and preserve ambiguity before they converge.

Monday, December 1, 2008

Does Cloud Computing Help Create Network Effect To Support Crowdsourcing And Collaborative Filtering?

Nick has a long post about Tim O'Reilly not getting the cloud. He questions Tim's assumptions on Web 2.0, network effects, power laws, and cloud computing. Both of them have good points.

O'Reilly comments on the cloud in the context of network effects:

"Cloud computing, at least in the sense that Hugh seems to be using the term, as a synonym for the infrastructure level of the cloud as best exemplified by Amazon S3 and EC2, doesn't have this kind of dynamic."

Nick argues:

"The network effect is indeed an important force shaping business online, and O'Reilly is right to remind us of that fact. But he's wrong to suggest that the network effect is the only or the most powerful means of achieving superior market share or profitability online or that it will be the defining formative factor for cloud computing."

Both of them also argue about applying power laws to cloud computing. I am with Nick on the power laws but strongly disagree with his view of cloud computing and network effects. The cloud at the infrastructure level will still follow power laws, due to the inherently capital-intensive requirements of a data center, and the tools on the cloud will help create network effects. Let's make sure we all understand what power laws are:

"In systems where many people are free to choose between many options, a small subset of the whole will get a disproportionate amount of traffic (or attention, or income), even if no members of the system actively work towards such an outcome. This has nothing to do with moral weakness, selling out, or any other psychological explanation. The very act of choosing, spread widely enough and freely enough, creates a power law distribution."

Any network effect starts with a small set of something, such as users or content, and eventually grows bigger and bigger. The cloud makes a great platform for systems that demand this kind of growth. The adoption barrier is close to zero for companies whose business models actually depend upon creating these effects. They can provision their users, applications, and content on the cloud, be up and running in minutes, and grow as the user base and content grow. This actually shifts power to the smaller players and helps them compete with the big cloud players while still creating network effects.

The big cloud players, which are currently on the supply side of this utility model, have a few options on the table. They can keep to the infrastructure business alone; here I would wear my skeptic hat and agree with a lot of people on the poor viability of this capital-intensive business model with its very high operational cost. This option alone does not make sense, and the big companies must have a strategic intent behind such a large investment.

The strategic intent could be to SaaS up their tools and applications on the cloud. The investment in, and control over, the infrastructure would provide a head start. They can also bring in a partner ecosystem and crowdsource a large user community to create a network effect of social innovation based on collective intelligence, which in turn would make the tools better. One of the challenges with recommendation systems that use collaborative filtering is mining massive amounts of information, including users' data and behavior, and computing correlations by linking it with massive information from other sources. The cloud makes a good platform for such requirements due to its inherent ability to store vast amounts of information and perform massively parallel processing across heterogeneous sources. There are obvious privacy and security issues with this kind of approach, but they are not impossible to resolve.
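As a rough sketch of that correlation step, here is a minimal user-based collaborative filter in plain Python, on a made-up ratings matrix; at cloud scale, the same pairwise similarity computation is what would be spread across many machines.

```python
from math import sqrt

# Minimal collaborative filtering: score items a user hasn't seen by
# the ratings of similar users. The ratings data is made up.

ratings = {  # user -> {item: rating}
    "alice": {"book-a": 5, "book-b": 3, "book-c": 4},
    "bob":   {"book-a": 4, "book-b": 3, "book-d": 5},
    "carol": {"book-b": 1, "book-c": 5, "book-d": 2},
}

def similarity(u, v):
    """Cosine similarity of two users' rating vectors (dot product
    taken over the items both have rated)."""
    common = ratings[u].keys() & ratings[v].keys()
    if not common:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in common)
    norm_u = sqrt(sum(r * r for r in ratings[u].values()))
    norm_v = sqrt(sum(r * r for r in ratings[v].values()))
    return dot / (norm_u * norm_v)

def recommend(user):
    """Rank unseen items by similarity-weighted ratings from others."""
    scores = {}
    for other in ratings:
        if other == user:
            continue
        sim = similarity(user, other)
        for item, rating in ratings[other].items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice"))  # ['book-d']
```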

Google, Amazon, and Microsoft are supply-side cloud infrastructure players that are already moving into the demand side of the tools business, though I would not call them equal players exploring all the opportunities.

And last but not least, there is a sustainability angle for the cloud providers. They can help consolidate thousands of data centers into a few hundred, sited based on geographic coverage and the availability of water, energy, and dark fiber. This is similar to consolidating hundreds of dirty coal plants into a few green power plants that produce clean energy with an efficient transmission and distribution system.