Saturday, February 2, 2008

Monetizing social networks and preserving privacy - an oxymoron?

How do social networks monetize their core platform and applications? It's more than a billion dollar question, figuratively and literally. Social network companies such as Facebook do recognize the potential of an open platform for participation and a developer-friendly attitude that lets the community sip the champagne of the social network data. There is a plethora of applications built on the Facebook platform, and this might be the key to monetization. The other key players have also been experimenting with their platforms and business models, but there is no killer business model, at least not yet.

Monetizing efforts do ruffle some feathers along the way, since they are intertwined with other factors such as privacy, data portability, and experience design. Facebook's experience design keeps applications' users inside Facebook but at the same time provides the necessary, or sometimes unnecessary, access to users' data to the application providers. This has set off some debates around privacy concerns. Access to users' data and an open architecture are key to the increased adoption that can potentially lead to monetization, but Facebook needs to be careful here not to piss off its users. Compare this with Google a few years back, when Google made a conscious decision to keep its search result rankings clean ("don't be evil"), and that strategy paid off when Google started monetizing via AdWords.

Marketers argue that the spending power of the current demographics of Facebook is not high, so why bother? This is true, but don't forget that when these millennials grow up to buy that 60" plasma TV, some companies want to make sure they have a brand tattooed in their heads from their previous branding experience on such social networks. As pointed out by many studies, millennials are not brand loyal, and that makes it even more difficult for the marketers. Facebook is a great strategic brand platform to infuse brand loyalty into these kids.

Data portability is part of the longer-term vision for any social network. The applications are constrained inside a social network, but the ability to take the data out in a standard format and mash it up with an application outside of Facebook has plenty of potential. Leading social and professional network providers have joined the Data Portability Group. Imagine being able to link your Facebook friends with your LinkedIn contacts and provide value on top of that. There are plentiful opportunities for the social network providers to build a partner ecosystem and give partners access to the data and services in a process of co-innovation. LinkedIn for the longest time resisted providing any APIs and relied on its paid subscription services. LinkedIn has tremendous potential in the data that it possesses, and standardizing the formats and providing the services opens up many monetization opportunities. It is good to see that LinkedIn has also joined the Data Portability Group and has promised to open up APIs. Google's OpenSocial effort, partially opening up Orkut as a sandbox, and social network visualization APIs are also steps in the right direction.
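At its core, the "link your Facebook friends with your LinkedIn contacts" idea is just a join on a shared identifier. Here is a minimal Python sketch, assuming each network let you export contacts in some common format; the field names and data are illustrative only, not any real API's output:

```python
# Hypothetical sketch: merge contact lists exported from two networks,
# keyed on e-mail address. Field names are made up for illustration.

def merge_contacts(facebook_friends, linkedin_contacts):
    """Combine two contact lists into one de-duplicated view keyed by e-mail."""
    merged = {}
    for contact in facebook_friends + linkedin_contacts:
        email = contact["email"].lower()
        entry = merged.setdefault(email, {"email": email, "networks": set()})
        entry["name"] = contact.get("name", entry.get("name"))
        entry["networks"].add(contact["network"])
    return merged

friends = [{"name": "Ada", "email": "ada@example.com", "network": "facebook"}]
contacts = [{"name": "Ada Lovelace", "email": "Ada@example.com", "network": "linkedin"},
            {"name": "Alan", "email": "alan@example.com", "network": "linkedin"}]

combined = merge_contacts(friends, contacts)
# "ada@example.com" now appears once, linked to both networks
```

The value-add an application could provide on top of such a merged view, of course, is where the monetization question starts all over again.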

What I can conclude is that the growth of such social networks is in two directions: platform and verticals. As the platforms become more open, we can anticipate more participation, a larger ecosystem, and service innovation. This should help companies monetize (no, no one has figured out how). Growth in verticals will help spur networks for specific domains such as employment, classifieds, auctions - who knows?

Monetization, experience design, and privacy cannot be separated from one another, and a few wrong strategic decisions could cause significant damage to the social network providers and their users.

Wednesday, January 9, 2008

Long Nose of Innovation

Innovation is an ongoing process. Bill Buxton likes to call it The Long Nose of Innovation. He describes the phenomenon this way: the bulk of the innovation behind the latest "wow" moment (multi-touch on the iPhone, for example) is low-amplitude and takes place over a long period, well before the "new" idea becomes generally known, much less reaches the tipping point. He has given quite a few examples in his post, emphasizing that it is all about idea refinement and innovating around existing technology. It is naive to throw away an idea just because it is old or not "new enough". A few other bloggers have also picked up the story, and Techdirt makes the argument that innovation is not a burst of inspiration but a process.

Google search and Gmail are examples that reinforce this phenomenon. When Google launched its search engine, people said, "What, one more search engine?" Gmail was quite late to the email game, but it forced other web-based email providers to shell out extra storage space. Gmail gained significant market share based on its large storage feature and by providing a better customer experience.

AJAX is another example in this direction. The technology support in the browser for AJAX-based applications was available well before the AJAX term was coined or these applications started becoming popular. Gmail heavily used AJAX to innovate around a better user experience. So watch out: what you need may be available right around the corner, and what you thought was a silly or old idea may not be silly anymore. IDEO has a "Tech Box", a centrally located lending library of innovation elements. Basically, it is a box that contains all kinds of materials, gadgets, etc. Many visitors call it a magic box that contains many IDEO innovations, but for IDEO it is a mindset and a physical statement. IDEO team members look into this Tech Box for materials when they are designing, or shall I say innovating, the next product.

Finding the user's right pain point and providing a better experience is the key to this long nose of innovation.

Monday, January 7, 2008

Technology Acquisition List

PartnerUp has compiled a long list of technology and web acquisitions in 2007. It is not a complete list and does omit many acquisitions, but it is an interesting one that shows a clear trend: smaller acquisitions have become more frequent. I welcome this change, since it allows the acquiring companies to try out the innovation at the prototype level, explore synergies, and add value to it, or toss it out. They can refine and iterate on alternatives without worrying upfront about return on investment. I see a lot of value in big companies using their cash to try out outside-in perspectives and not just rely on their internal innovation engine early on.

This is certainly better than buying "startups" at a later stage, paying through the nose due to the ridiculous artificial valuations that these "startups" get, and then not being able to figure out what to do after the acquisition. Skype's acquisition by eBay is an example of one of these scenarios, where eBay still cannot find the synergies and monetize the acquisition.

Saturday, September 15, 2007

Interview with Craig Mundie

Craig Mundie, Microsoft's technology chief, talks about everything from Vista's low sales numbers, cloud computing, competing with Google, and the Linux desktop to his SPOT watch in this long interview. No surprises so far.
  • Cloud computing: He believes that the world is not black and white and that people won't ditch their desktop clients anytime soon to completely migrate to software as a service. I think this makes sense. Microsoft is investing significantly in cloud computing via its "Live" initiative, but the strategy is to complement the on-premise model to achieve a client-server-service deployment model.
  • Vista: He admits that the number of Vista copies sold so far is a small fraction of Microsoft's overall customer base. He calls it a cycle of diffusion and exploitation, with Vista in the diffusion phase waiting to be exploited. This is a chicken-and-egg problem: there are not enough Vista consumers out there, so developers are not that excited, but there is not enough incentive for consumers to migrate to Vista unless the development community adopts it and adds value to it.
  • Google: This is my favorite: "Google's existence and success required Microsoft to have been successful previously to create the platform that allowed them to go on and connect people to their search servers." This is a twisted argument. He makes it clear that it is not only an infrastructure play but a combined infrastructure and client play to reach consumers. He is betting on people needing a desktop and other clients to connect to whatever Google offers.
  • Office Open XML Standard: Not there yet, but he promises that it is not far off. He says, "There are a lot of people who have raised a great many issues which we don't think have a lot of practical merit, but serve the purpose of creating some anxiety during this process." This is a classic standards problem, and people are super cautious when it comes to Microsoft. It is not about the technology but about how you come clean and convince people that you are listening to them.

Friday, September 14, 2007

Design thinking and designers

The conversation with Brandon Schauer, a design strategist at Adaptive Path, about design thinking is worth reading. Brandon talks about topics such as critical thinking versus design thinking, design attitude versus decision attitude, and the importance of business fluency among designers.

I agree that for a business problem you do want to apply design thinking to explore as many alternatives as you can, but you also want to think critically through all the alternatives before you reach a solution, and keep your stakeholders informed about your decisions. Not only is business fluency critical to doing this, but a designer needs to have empathy for the stakeholders as well. Traditional ethnography techniques such as contextual inquiry can be used to understand users' goals and aspirations, but designers need to go a step further and understand their stakeholders better, and for that they need to acquire skills in the business and strategy areas.

Sunday, September 9, 2007

The eBay way to keep infrastructure architecture nimble

eBay has come a long way from an infrastructure architecture perspective, from a system that didn't have any database to the latest Web 2.0 platform that supports millions of concurrent listings. An interview with eBay's V.P. of systems and architecture, James Barrese, The eBay way, describes this journey well. I liked the summary of the post:

"Innovating for a community of our size and maintaining the reliability that's expected is challenging, to say the least. Our business and IT leaders understand that to build a platform strategy, we must continue to create more infrastructure, and separate the infrastructure from our applications so we can remain nimble as a business. Despite the complexity, it's critical that IT is transparent to our internal business customers and that we don't burden our business units or our 233 million registered users with worries about availability, reliability, scalability, and security. That has to be woven into our day-to-day process. And it's what the millions of customers who make their living on eBay every day are counting on us to do."

eBay's strategy of identifying the pain points early on, solving those problems first, and keeping the infrastructure nimble enough to adapt to growth has paid off. eBay focused on an automated process for rolling out the weekly builds into the production system and for tracking down the code change that could have destabilized a certain set of features. The most difficult aspect of sustaining engineering is isolating the change that is causing an error; fixing the error once the root cause is known is relatively easy most of the time. eBay also embraces the fact that if you want to roll out changes quickly, limited QA efforts, automated or otherwise, are not going to guarantee that there won't be any errors. Anticipating errors and having a quick plan to fix them is a smart strategy.
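Tracking down which change destabilized a weekly build is, at heart, a binary search over the ordered list of changes (the same idea git later popularized as "bisect"). A toy Python sketch, purely illustrative and not eBay's actual tooling:

```python
# Illustrative sketch: once a weekly build is known to be bad, a binary
# search over the intervening changes isolates the culprit in O(log n) builds.

def find_breaking_change(changes, is_good):
    """Return the first change whose build fails.

    `changes` is ordered oldest-to-newest; `is_good(i)` tests the build
    containing changes[0..i]. Assumes exactly one change flipped good -> bad.
    """
    lo, hi = 0, len(changes) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_good(mid):
            lo = mid + 1      # failure was introduced later than `mid`
        else:
            hi = mid          # failure is already present at `mid`
    return changes[lo]

changes = ["c1", "c2", "c3", "c4", "c5"]
# pretend c4 destabilized the build: builds through c3 still pass
culprit = find_breaking_change(changes, lambda i: i < 3)
# culprit == "c4"
```

With dozens of changes per weekly build, log-time isolation is what makes the "fix it fast once you find it" part of the strategy feasible.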

If you read the post closely, you will observe that all the efforts seem to be related to the infrastructure architecture: high availability, change management, security, third-party APIs, concurrency, etc. eBay did not get distracted by the Web 2.0 bandwagon early on and instead focused on a platform strategy to support its core business. This is a lesson many organizations could learn: be nimble, do what your business needs, and don't get distracted by disruptive changes; embrace them slowly instead. Users will forgive you if your web site doesn't have rounded corners and doesn't do AJAX, but they won't forgive you if they couldn't drum up their bid and lost the auction because the web site was slow or unavailable.

One of the challenges eBay faced was the lack of good industry practices for similar kinds of requirements, since eBay was unique in the way it grew exponentially and had to keep changing its infrastructure based on what it thought was the right way to do it. eBay is still working on a grid infrastructure that could standardize some of its infrastructure and service delivery platform architecture. This would certainly alleviate some of the pains of a proprietary infrastructure and could potentially become the de facto best practice for the entire industry for achieving the best on-demand user experience.

eBay kept it simple - a small list of trusted suppliers, infrastructure that can grow with the users, and a good set of third-party APIs and services to complete the ecosystem and empower users to get the maximum juice out of the platform. That's the eBay way!

Thursday, September 6, 2007

Are RDBMS obsolete?

Today Slashdot picked up a story on column-oriented databases. The story claims that the one-size-fits-all approach does not work well for current data warehousing requirements and that organizations should explore options beyond legacy RDBMS. The post says, "Hence, my prediction is that column stores will take over the warehouse market over time, completely displacing row stores."

The fundamental assumption here is that data warehousing solutions are drastically different from OLTP ones and that's why they have different storage, or I should say access, needs. What the post is missing is that many modern OLTP applications require real-time analytics side by side and cannot really depend upon a separate data warehouse. Technologies such as in-memory databases and materialized views that run on top of an OLTP RDBMS make it feasible for an application provider to have just one hybrid system - OLTP or data warehousing, whatever you want to call it. This was obviously not the case a few years back, when you could get shot for proposing to run your analytics on a production (OLTP) database. I do believe there is a need for special-purpose databases with different architectures for very specific kinds of applications, but the RDBMS is far from obsolete. I heard similar arguments in the past when object-oriented database vendors claimed that the RDBMS would become obsolete once people switched over to object-oriented programming languages. Deja vu all over again!
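To make the row-store versus column-store distinction concrete, here is a toy Python sketch of the two layouts (the data and structures are purely illustrative). An analytic aggregate over one attribute only needs that attribute's values, which is why columnar layouts scan less data per warehouse-style query, while a row layout keeps each record together, which suits OLTP-style reads and writes of whole records:

```python
# Minimal sketch of the two physical layouts for the same logical table.

rows = [  # row-oriented: each record stored together (OLTP-friendly)
    {"id": 1, "item": "watch", "price": 120.0},
    {"id": 2, "item": "lamp",  "price": 35.0},
    {"id": 3, "item": "radio", "price": 80.0},
]

columns = {  # column-oriented: each attribute stored together (scan-friendly)
    "id":    [1, 2, 3],
    "item":  ["watch", "lamp", "radio"],
    "price": [120.0, 35.0, 80.0],
}

# Row store: the aggregate drags every field of every record through memory.
total_row = sum(r["price"] for r in rows)

# Column store: the aggregate reads one contiguous array and nothing else.
total_col = sum(columns["price"])

assert total_row == total_col == 235.0
```

Both layouts answer the query identically; the argument is only about which physical organization, or hybrid of the two, a given workload deserves.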