World of Content

Tuesday, November 21, 2006

Wanted: New Management

In the wake of the sale and break-up of media giant Knight Ridder, the recent auction and sales of Clear Channel Communications and Reader's Digest to private equity groups, and a similar anticipated fate for the venerable Tribune Company, it is clear that investors are dissatisfied with the performance of many “old” media companies, and believe that the valuable content assets of these companies can be managed better under new ownership. In a parallel universe, Brad Garlinghouse, a Yahoo! Senior Vice President, expressed his frustrations with the management of that quintessential New Media company over its lack of “a clear and focused vision.”

The content business is starting to look like the airline business, where the largest players, in spite of having a highly valued product and well-established brands, have struggled to survive. In the case of the airlines, it took a small upstart (Southwest) to demonstrate that cost cutting and innovative management could change the game and create sustainable shareholder returns. As with the airlines, much of the immediate focus is on cutting costs. However, cost cutting alone is unlikely to succeed unless these companies also figure out new ways to capture value from their most loyal users. Southwest did this in the airline industry by focusing on what was most important – like low prices, convenient schedules, and dependable performance – while jettisoning costly features that, in the end, customers did not consider so important.

Will the new owners and managers of content companies have the insight and understanding of their audiences that will enable them to make the right calls through the tough times ahead?

Wednesday, November 15, 2006

The New BoardReader Launches

Ever wonder if anybody else has faced the same problem you have, like how to get unstuck from level 4 in your favorite video game, or how to choose a sax mouthpiece that will help you sound like John Coltrane? Finding a forum with other people who share your interests, no matter how specialized or obscure those interests might be, just got a whole lot easier with the release of BoardReader's new site. BoardReader is the largest aggregator and indexer of message board data, monitoring and indexing tens of thousands of message boards – a number that is growing by about 100 per day. They index millions of posts per day, split roughly evenly between English and non-English items. The company has been doing this since 2001, and has built up significant expertise in parsing and normalizing content generated by the many different message board hosting platforms in use. [Fair disclosure: I have been working with BoardReader to help them develop new business lines and expand their market presence.] With the release of this new, free search tool, BoardReader has emerged as the essential complement to Google and Technorati for uncovering unique content in the “deep” or “hidden” web.

Along with the ability to find boards that you may want to join or monitor, the new site has some cool analytic tools that can help you discover trends in what the millions of message board enthusiasts are discussing. Want to know the most popular online videos that message boards are linking to? How about a graph showing what domains are influencing message board communities? There will soon be a tool that will show what news stories are being discussed, along with a graphical representation of the “buzz” around a particular story over time. Such sophisticated and powerful analytical tools – similar to those that are used on “surface web” content by expensive analytical applications or services – have never before been accessible for message board data.
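For the technically curious, the analytics described above boil down to counting link mentions across posts, and counting them again bucketed by day to draw a “buzz” graph. Here is a rough sketch of the idea in Python – the sample data and function names are my own invention for illustration, not BoardReader's actual implementation:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical sample of message-board posts: (date, URLs linked in the post)
posts = [
    ("2006-11-13", ["http://youtube.com/watch?v=abc", "http://nytimes.com/a"]),
    ("2006-11-14", ["http://youtube.com/watch?v=xyz"]),
    ("2006-11-14", ["http://nytimes.com/b", "http://youtube.com/watch?v=abc"]),
]

def domain_influence(posts):
    """Count how often each outside domain is linked from board posts."""
    counts = Counter()
    for _, urls in posts:
        for url in urls:
            counts[urlparse(url).netloc] += 1
    return counts

def buzz_over_time(posts, domain):
    """Mentions of one domain per day -- the raw data behind a buzz graph."""
    daily = Counter()
    for date, urls in posts:
        daily[date] += sum(1 for u in urls if urlparse(u).netloc == domain)
    return dict(sorted(daily.items()))

print(domain_influence(posts).most_common(1))   # [('youtube.com', 3)]
print(buzz_over_time(posts, "youtube.com"))     # {'2006-11-13': 1, '2006-11-14': 2}
```

The hard part of doing this for real, of course, is not the counting – it's the years of parsing and normalizing dozens of board-hosting platforms that feed clean data into it.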

Check it out, and let me know what you think by emailing me or posting a comment here.

Tuesday, November 14, 2006

“Web 3.0” Makes its Public Debut

Don’t be misled by the title of this post: the quotation marks are the most important part. Sunday’s New York Times featured an article by John Markoff in which he introduces the term “Web 3.0” for the first time in a mainstream media publication. Web 3.0 (no quotes this time) and its closely related descriptor, “The Semantic Web,” have been discussed in geek circles and on tech blogs for some time, but the NYT article clearly marks a milestone.

Turning away from the terminology debate for a moment, it is worthwhile to look at the ways this next-generation approach to finding information is already making its way into applications and venture capital portfolios. At its heart, the semantic web is another step in the evolution of content mark-up languages (XML) that enable computers to treat textual data (technically “unstructured” data, versus “structured” data that fits conveniently into rows and columns) in more intelligent ways. The enrichment of content – using free-form tags (“folksonomies”) or structured categorization systems (taxonomies) – has its roots in information science going back decades, and even further back to the beginning of the 20th century, with the introduction of the Dewey Decimal System in libraries. What is new and exciting about the budding Web 3.0 era is the convergence of text mining and artificial intelligence, which allows computers to glean the meanings of words in context, with new applications that bring human intelligence (acting individually and in groups) into the process. Whether through collaborative applications like wikis, through “voting systems,” or through self-defined communities of interest, like my former Biz360 boss You Mon Tsang’s new company, Boxxet – this combination of computing power and the inherent “wisdom of crowds” is already having an impact.
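To make the structured-versus-unstructured distinction concrete: a machine sees an ordinary sentence as just a string of words, but once the same facts are marked up, a program can query them directly. A toy sketch (the tag names here are invented for illustration – they are not any real semantic-web standard):

```python
import xml.etree.ElementTree as ET

# Unstructured: to a machine, just a string of words.
sentence = "John Coltrane recorded A Love Supreme in 1964."

# "Semantically" marked up: the same facts, now addressable by a program.
markup = """
<statement>
  <person role="artist">John Coltrane</person>
  recorded
  <work type="album">A Love Supreme</work>
  in <date>1964</date>.
</statement>
"""

root = ET.fromstring(markup)
print(root.find("person").text)           # John Coltrane
print(root.find("work").attrib["type"])   # album
print(root.find("date").text)             # 1964
```

Multiply that by billions of pages, and you can see both the promise and the problem: somebody, or something, has to supply all that markup.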

The company discussed in the Markoff article is a start-up called Radar Networks, founded by web visionary Nova Spivack, who previously founded EarthWeb and took it public in 1998. While his new company is still operating in stealth mode, the interview hints at the direction it is going. To get accurate results (i.e., results that would be plausible to average common sense) from any mathematical algorithm, you need masses of data. Text mining can “discover” concepts and trends from even a small corpus, but the results are often strange or laughable unless the process has been run across hundreds or thousands of documents. The same is true of algorithms based on “crowd” data or behavior; witness how Google rankings can be manipulated or distorted by the intentional actions of a small minority of users. It will be interesting to see how Radar Networks and other companies looking to commercialize the semantic web deal with this problem. The semantic web will inevitably become reality, however, enabled by the inexorable growth of computing power as more and more of us participate in the online world. Let’s stay tuned.
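A quick way to see why small corpora produce laughable results: co-occurrence statistics, a staple of text mining, simply cannot disambiguate a word's senses when the evidence is thin. A minimal sketch (toy documents invented for illustration):

```python
from collections import Counter

# A deliberately tiny "corpus". With so little data, co-occurrence
# counts cannot tell the car sense of "jaguar" from the animal sense.
tiny_corpus = [
    "jaguar car review",
    "jaguar spotted in zoo",
]

def cooccurring(corpus, term):
    """Count terms appearing in the same document as `term`."""
    counts = Counter()
    for doc in corpus:
        words = doc.split()
        if term in words:
            counts.update(w for w in words if w != term)
    return counts

# "car" and "zoo" come out equally related to "jaguar" -- the statistics
# only become meaningful once thousands of documents tip the balance.
print(cooccurring(tiny_corpus, "jaguar").most_common())
```

Scale the corpus up by a few orders of magnitude and the dominant sense wins decisively; that is the "masses of data" requirement in a nutshell.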

Monday, November 13, 2006

Digital Fish Wrap?

Peter Scheer, Executive Director of the California First Amendment Coalition, authored a provocative opinion piece in yesterday’s San Francisco Chronicle, suggesting that newspapers should raise the value of subscription services by enforcing an industry-wide 24-hour embargo on news content before it is made accessible through the free portals and search engines. Scheer says that even the most successful newspapers have trouble selling enough online advertising to cover 10 percent of what they have lost in print advertising, concluding that only by cutting off the portals from their sources of traffic-driving fresh news can newspapers win subscribers over to their own sites. Even setting aside the remote possibility that large and small news organizations could ever cooperate on the scale necessary to make this effective, or the more likely scenario that industry-wide collusion forcing consumers to pay more for news would be declared a violation of antitrust law, it still sounds like a bad approach to the problem.

A major flaw in Mr. Scheer’s argument is his assumption of an online business environment that still follows the rules of the pre-dot-bomb era, back when page views and banner ads ruled the day, and site owners only made money by being destinations for users. That vision of the Internet was no more than an attempt to simply duplicate print ads onto a web page instead of a paper one – it never took into account the dynamic nature of the web. It took Google to recognize the possibilities that could be created when every search could be turned into a new opportunity for serving highly targeted ads. Scheer also ignores the revenue-generating possibilities of newer technologies like RSS, mash-ups, and social tools that create new opportunities for engagement with users outside of the subscription model. What we are seeing today is the concentration of these tools in the hands of large technology companies, so they are the ones who happen to be profiting at the moment. However, most of these are open technologies, and there is no reason why news organizations, with their well-established brands and their built-in, loyal local audiences, could not use some of these tools to turn themselves into successful portals. The great majority of web users will probably never become subscribers, but they will continue to use the Web in more ways, for information, entertainment, and community. Newspapers would be kidding themselves if they thought they could truly turn that tide by erecting new walls around fresh content. They would be much better served to follow the tide, i.e., learn what this new generation of web users is doing, and keep coming up with ways to make all of their content more engaging and more valuable.

Tuesday, November 07, 2006

News Crowdsourcing Goes Mainstream: Caveat Emptor

Gannett Corp. is restructuring the news-gathering organizations at its more than 90 papers, including flagship USA Today, in order to leverage the investigative reporting of readers. Wired News reported on Friday that Gannett has rechristened its newsrooms as “information centers” that will use the company's websites to solicit user-generated content, including whistleblower tips and investigative details from local readers. It’s a fascinating development that mirrors what’s going on in many other areas of media, like photography and music. It makes great sense to get the masses of readers with time on their hands to do some of the “heavy lifting” that good investigative journalism requires. Of course, Gannett is already being criticized by media professionals, who say the company is just cutting expenses by substituting amateur for professional content. However, as one of the foremost promoters of “pop” news, Gannett is probably already immunized against criticism that its news coverage will now become less serious.

A more serious concern, however, should be over how this user-generated content will be managed. As recent election campaigns have shown, the web is full of false "news" perpetrated by unscrupulous partisans, and user-generated articles in Wikipedia are frequently manipulated for political purposes. A recent article in the New Yorker documents some of the “Dirty WikiTricks” that are now becoming a feature of partisan politics. While it’s unlikely that anyone will succeed in suing Wikipedia for defamation, the deep-pocketed Gannett Corporation may not be so lucky. So Gannett would be well advised not to be so quick to lay off those investigative journalists. While their job descriptions may be changing, they will surely still be needed.

Monday, November 06, 2006

Another Corporate “Flogging” Revelation: PR Pros Continue to Struggle with Social Media

Add McDonald’s to the list of corporate PR departments recently exposed for using supposed consumer blogs as shills. The latest incident, described by Tom Siebert in today’s Online Media Daily, coming so soon on the heels of the Edelman/Walmart fiasco, further underlines the risks that PR pros take when they try to co-opt the blogging movement. Hiring professional journalists to ghostwrite supposedly amateur blogs and posting professionally produced (but made-to-look-amateur) videos to YouTube have become standard practice in the “new PR.” Before investing in more bright ideas like these, it might just be a good idea for marketers to ask their friends at Walmart and McDonald’s, “How did that blog strategy work out for you?” No matter how cynical and jaded these marketing hacks may be, it’s hard to see how the exposure of these practices could have positively influenced the already-damaged public esteem and credibility of these huge consumer brands.

With every new technology or new idea that achieves popularity, there will always be people who will believe that the changes are so revolutionary that the old rules no longer apply. (Sort of like the investment community during the bubble years, when supposedly respectable analysts pumped up companies that they couldn’t understand and that had slim prospects of ever returning a profit.) If marketers mistake the rise of social media as an excuse to ignore the basic rules of journalistic integrity and transparency, such mistakes will not be forgotten or forgiven quickly. To coin a phrase, the public may be gullible, but they are not stupid.

Friday, November 03, 2006

Bottlenecks in the Blogosphere – “A” List Bloggers Under Siege by Marketers

Top tech bloggers Robert Scoble and Michael Arrington got a little testy this week, reflecting some of the pressures that “A” list bloggers have come under. Michael, owner of the TechCrunch site, was reacting to criticism that his writings about start-up companies and industry gossip might be too self-serving. Robert complained about being overwhelmed by a flood of email pitches from tech PR people trying to get him to write about their companies. The common thread is the segmentation of the blogosphere, which has created a very small elite class of bloggers who, whether they like it or not, are being treated more like mainstream media pundits. A major problem for these guys, and for the PR folks who are now diligently tracking and targeting them, is that they don’t have the infrastructure and staff resources of mainstream media to support them or act as a buffer between them and marketers. A Wall Street Journal profile of Arrington described the dilemma of the elite tech blogger: even though he reports news and does product reviews, as a part-time journalist who is also an entrepreneur, he has no traditional (albeit usually mythical) “Chinese Wall” between editorial and marketing to fall back on. You get the impression that these guys are enjoying the spotlight (which they undoubtedly deserve) but don't quite know how to deal with the responsibilities that go with it.

This phenomenon is just another example of how much PR professionals still have to figure out about social media. Like generals who are always fighting the last war, most PR pros are acting like the old rules are still in effect – assuming that most of the influence is still wielded by an elite few, and that the “wisdom of the crowd” is primarily the voice of just a few of its members. With the heavy influencers themselves saying, “Back off,” effective PR in the Web 2.0 world is going to require a lot more imagination than that.