There is a part of the Internet that is closed to public eyes, available only to subscribers and registered site users — the "deep web." Media outlets such as the Wall Street Journal have recently moved toward a new revenue model by placing content behind subscriber-only barriers, a move that a recent Technorati report points out is hurting the site's reputation as an influential news outlet:
While the WSJ has begun to offer some content outside of its subscriber-only site, the policy is clearly costing them some influence and attention in the blogosphere, as bloggers find it difficult to link to articles in the subscriber-only sections.
This raises an interesting question about the deep web (all the pages that sit behind layers of security or registered access). Given the dominance of search engines as an access point to information, and their inability to index pages in the deep web, is this situation accelerating the decline in relevance of the major news outlets? And if so, how can they get around it while still realizing the financial and audience-segmentation benefits of having users register or subscribe for content?
One answer may come from MarketingSherpa.com, which has developed a very effective way of delivering its content. New stories are emailed to registered site users along with expiration dates for viewing the content free of charge. After that date, the content must be purchased. This model works because it drives me to read new content right away … and it has also led me more than once to purchase archived content (because I have already read it and know it is relevant). Seems like a model some of the larger media outlets should pay attention to …