Condé Nast, the owner of Wired magazine, has just bought the user-filtered news site Reddit, according to TechCrunch. After AOL set up the Netscape beta as a user-filtered news site, this is the second significant foray of large media into this space. As described in our Future of Media Report 2006, user filtering is the Web 2.0/participative analogue of editorial: it is not just users creating content, but also users editing it. Chris Anderson, editor of Wired magazine, is very excited about the acquisition, which plays perfectly to his long tail themes. There will be lots more play in this space yet.
Tim Berners-Lee, the creator of the World Wide Web, has just announced that the World Wide Web Consortium (W3C) will establish a completely new working group to develop HTML, in his words “reinventing HTML”. This bold move has been prompted by the too-slow evolution of a mark-up system that works reasonably well but is still flawed. Attempts to evolve HTML into a “well-formed” structure that draws on the power of XML have been stymied by most people’s satisfaction with the current standard.
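The well-formedness gap is easy to see with a strict XML parser: markup like an unclosed break tag is perfectly acceptable legacy HTML, but is rejected by XML tooling. A minimal illustration using Python’s standard library (the snippets of markup are my own, purely for demonstration):

```python
import xml.etree.ElementTree as ET

def is_well_formed(markup):
    """Return True if the markup parses as well-formed XML."""
    try:
        ET.fromstring(markup)
        return True
    except ET.ParseError:
        return False

legacy_html = '<p>Line one<br>Line two</p>'   # <br> never closed: fine in HTML, invalid XML
xhtml_style = '<p>Line one<br/>Line two</p>'  # self-closing tag: well-formed XML

print(is_well_formed(legacy_html))  # False
print(is_well_formed(xhtml_style))  # True
```

The point of the “well-formed” push is that the second style can be processed by any generic XML tool, while the first requires a forgiving, HTML-specific parser.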
In my book Living Networks I created a chart showing what I described as the gradual progress towards open, accepted standards – see below. On one level, HTML is a poster child for an open, accepted standard, sitting on the upper right of the diagram. There are no competitors for HTML – it is fully accepted as the standard for representing content on the web, and the W3C, for all the criticism it garners, genuinely attempts to represent and incorporate the views of all stakeholders. Yet, as anyone who has been involved in a standards committee knows, maintaining and developing an existing standard, particularly one with the impact of HTML, is no easy task. There is ample scope for personalities and politics, which will certainly have to be addressed in this case. The developer community seems split between the positive and enthusiastic on one side and the skeptical on the other, along with other interesting analysis.
The gradual shift to open, accepted standards
I have been applying scenario planning with clients for the last decade across a variety of industries and environments, including the future of financial services, technology, capital markets, risk management, construction, Internet, Asia, and far more. As I wrote back in 1998 in an article on scenario planning in portfolio and risk management, “The greater the degree of uncertainty and unpredictability, the greater the value of using multiple scenarios rather than forecasts.”
Knowledge@Wharton has just published an interesting discussion on Will a New Theory Help Firms to Manage in a ‘Flat’ World? (registration required), which looks at how executives can make sense of the rapidly changing environment. Paul Kleindorfer, a Wharton professor of operations and information management, made this very interesting comment:
In the past six months, there has been an upsurge in the number of companies coming through INSEAD [the European institute for management education] looking for assistance in scenario planning and scanning, or determining the signposts that suggest which scenario or scenarios should be the focus. Some companies — like Nestle, Unilever and Procter & Gamble — have been doing some scenario planning, but it’s been directed toward competition and technology. So these and other companies were completely blindsided by the recent increase in mineral oils — which was spurred by a law in Germany requiring power plants there to burn 10% bio-fuel by 2010 — and its impact on the vegetable oils and other ingredients they purchase for their products.
These sorts of commodity risks have escaped the scrutiny of many companies. Now they see a single government make a decision and it throws the profitability of an important ingredient out the window. So scenario planning and scanning, together with strategic modeling, intelligence and other issues, are really beginning to take on a much larger significance than before. It used to be about markets, technology and competitors, but now there’s a much richer texture.
Over the last decade I have certainly seen how the cycle of interest in scenario planning from major organizations has tracked the degree of perceived uncertainty in the business environment. The scope of the imponderable now, ranging from geopolitics to consumer behavior, overlaid on the necessity for long-term strategic thinking, means that scenario-based approaches are again on the rise. As suggested by Kleindorfer’s comments, I have seen many traditional consulting firms do scenario planning in such a reductionist manner that the scenarios cover only part of the scope of uncertainty, which entirely defeats the purpose. Today more than ever, there is massive value in engaging in scenario planning for long-term strategy development, in a way that really does uncover assumptions and open out thinking across the organization.
I’m just back from a quick trip to South Africa, where I am working with a large organization to help develop their long-term corporate strategy. One of the many insights on this fascinating trip was how mobiles are leapfrogging the Internet across Africa. Across the continent, and even in relatively developed South Africa, fixed broadband Internet is difficult to access, expensive, and unreliable. Mobiles have already leapfrogged fixed-line telephony across the continent. Research done last year indicated that 85% of small businesses run by black people in South Africa rely solely on mobile phones, and that 97% of people in Tanzania have access to a mobile phone, compared to 28% for fixed lines. Now phone companies are taking the opportunity to offer mobile data services and internet access. As a GSM-based continent, GPRS and HSDPA (which is very high speed – sometimes called 3.5G) are the core data platforms. MTN, a South African telecoms company with 28 million subscribers across 10 African countries, is already broadly offering data services, with of course little competition from other platforms. Most phones have GPRS capabilities, making data access a core functionality available to mobile users. MTN is using HSDPA to help Internet cafes set up in townships where fixed Internet access is just a dream.

Interestingly, the BBC recently reported that 61% of its international WAP users are in Nigeria, and 19% in South Africa. This partly reflects that WAP is little used in Europe, but it certainly shows that WAP is a viable technology where internet access comes primarily from mobile phones. Thus in Africa, digital content providers must focus on mobile delivery if they want to reach anyone beyond the handful of the elite who live in select areas and can afford fixed broadband. Despite its enormous economic and other problems, Africa is becoming a showcase for the potential of the mobile internet.
I have previously written about blogging and Regulation FD, the Securities and Exchange Commission (SEC) regulation that requires egalitarian dissemination of substantive news that could affect the share price. On the face of it, blogs and RSS are the perfect way to give everyone equal access to news. Yet this is not allowed by current SEC regulations. So Jonathan Schwartz, CEO of Sun Microsystems, has written to Christopher Cox, the chairman of the SEC, to ask him to change the regulations to allow blogs and similar tools to be used for disclosure of substantive news. He says that previous conversations with Cox indicate this will be heard with receptive ears. Schwartz has of course disclosed this on his own blog, together with the full letter to Cox. If this change is approved, it will be an enormous boost for blogs: investors and intermediaries will have to monitor the blogs of public company officers, and company directors will be able to disclose substantive information on their blogs, in turn reducing the governance issues of corporate blogging. It makes all the sense in the world to use the power of RSS to disseminate information – in fact this would be a significant improvement on current mechanisms. With just a tiny variation in the regulations on appropriate ways of disseminating corporate information, blogging could become quite a different world, with the development of a thoroughly corporate segment of the blogosphere focused on egalitarian diffusion of investor information – and, by-the-by, a deeper and broader view of public company activities and better-informed investors.
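To make the mechanism concrete, here is a minimal sketch of what a disclosure feed might look like, built with Python’s standard library. The company name, URLs, and item content are hypothetical; the element names follow the RSS 2.0 format. The point is that anyone subscribing to the feed receives the item at the same moment:

```python
import xml.etree.ElementTree as ET

# Build a minimal RSS 2.0 feed for investor disclosures (hypothetical company).
rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example Corp investor disclosures"
ET.SubElement(channel, "link").text = "https://example.com/investors"
ET.SubElement(channel, "description").text = "Material news, published to all subscribers simultaneously"

# One disclosure item; guid lets readers deduplicate across polls.
item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "Q3 earnings released"
ET.SubElement(item, "pubDate").text = "Tue, 03 Oct 2006 09:00:00 GMT"
ET.SubElement(item, "guid").text = "https://example.com/investors/q3-2006"

feed_xml = ET.tostring(rss, encoding="unicode")
print(feed_xml)
```

Any standard feed reader polling this document sees new items as they are appended, which is exactly the egalitarian-distribution property Regulation FD is after.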
Netflix has just announced a $1 million prize for whoever can improve the accuracy of its movie recommendation engine. To enable people to design an improved engine, it has released 100 million of its users’ movie ratings, an extremely valuable database. This harkens back to Canadian gold mining company Goldcorp’s initiative, whereby it publicly released the geological data on its properties and set up a competition with prizes for whoever could give it the best recommendations on where to dig for gold. Other open innovation initiatives such as InnoCentive match a whole series of people looking for innovation, again providing pre-specified rewards for meeting specific parameters. Some note that the prize will mean a lot of people working for free, and it’s arguable that if you can indeed do better than the other competitors, you’ll be able to make more than $1 million from it commercially anyway. The size of the prize indicates the value of enhancing the accuracy of collaborative filtering, as I’ve written about many times before. The more accurately Netflix can recommend movies to its customers, the more likely they are to stay with Netflix. For companies with other business models, greater accuracy directly impacts sales and revenue. More and more energy and resources will be going into this space. Netflix has chosen to combine two of my passions – open innovation and collaborative filtering – so I will be very interested to see the results. Details of the prize are at netflixprize.com, which will provide a progress chart on how the competing teams are doing.
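Netflix’s own Cinematch algorithm is proprietary, but the basic idea behind collaborative filtering is simple: predict a user’s rating from the ratings of similar users. A minimal sketch of user-based collaborative filtering, with toy data and hypothetical names of my own invention:

```python
import math

# Toy ratings matrix (hypothetical data): user -> {movie: rating on a 1-5 scale}.
ratings = {
    "alice": {"Heat": 5, "Se7en": 4, "Amelie": 1},
    "bob":   {"Heat": 4, "Se7en": 5, "Amelie": 2, "Fargo": 5},
    "carol": {"Heat": 1, "Se7en": 2, "Amelie": 5, "Fargo": 1},
}

def cosine(u, v):
    """Cosine similarity computed over the movies both users rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[m] * v[m] for m in common)
    norm_u = math.sqrt(sum(u[m] ** 2 for m in common))
    norm_v = math.sqrt(sum(v[m] ** 2 for m in common))
    return dot / (norm_u * norm_v)

def predict(user, movie):
    """Similarity-weighted average of other users' ratings for the movie."""
    num = den = 0.0
    for other, their_ratings in ratings.items():
        if other == user or movie not in their_ratings:
            continue
        sim = cosine(ratings[user], their_ratings)
        num += sim * their_ratings[movie]
        den += abs(sim)
    return num / den if den else None

# Alice's tastes track Bob's, so her predicted rating for "Fargo" leans toward his 5.
print(predict("alice", "Fargo"))
```

The prize is essentially about squeezing more accuracy out of predictions like this at the scale of 100 million real ratings, where simple similarity-weighted averages leave a lot of room for improvement.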