Sometimes, the most salutary moments come when someone outside a situation offers a few observations. Recently, Craig Newmark, founder of Craigslist, reflected on the state of news media in a way that’s instructive for both news organizations and publishers overall. To begin:
As a nerd, I’m excited by the new tech, particularly mobile, including new display systems and pervasive connectivity. However, the tech is secondary, not nearly as important as repairing some current issues with trust and curation.
This really hits the nail on the head when it comes to the social obligations of information purveyors. But a thoroughly transformed information-exchange landscape introduces major twists.
A recent example of this was given at the O’Reilly Tools of Change conference I attended in Frankfurt. Matt McCalister from the Guardian told the story of a protest gone awry, in which a protester was killed amid suspicions of police brutality. No evidence was forthcoming, and the police denied any involvement. A journalist from the Guardian made a plea to readers for evidence, telling them they could trust him to do the right thing with the information. Soon enough, a video showing the police beating surfaced, leading to a formal investigation.
This is the trust issue in a nutshell, but it’s much more than that: it’s about trust being used differently, about trust in a world of information abundance with an increasingly robust information infrastructure that allows it all to play out differently.
Curation is another matter. We live in an age when people expect news to find them, so putting it into their workflows and daily experiences is different from what it was when a small set of packages (newspapers, news programs) were where people sought the news. Curators have to place the news differently, and probably more repetitively, than before, or the news doesn’t reach its audience. Newmark nails the issue again when reflecting on the coverage of the financial meltdown and the pre-Iraq invasion “intelligence”:
Good reporters told us that something was amiss in both situations, and we did see some really good journalism in both cases. However, the really good journalism was buried, not curated into the front pages, and then, infrequently if at all repeated. . . . The new model for news curation and selection, I feel, will be a balance of professional editing and collaborative news filtering. In one incarnation, news organizations will look at feeds from highly respected news fans, and that will drive stories that are featured more prominently.
The curation opportunity is a major one for publishers of all types, but thinking it ends with the publication of a traditional book or journal is a losing proposition. And thinking that being “the best” is what creates interest underestimates the repetition required and dismisses major opportunities for connecting with a professional audience seeking novelty.
I was recently talking with a couple of researchers who observed that the most interesting science isn’t usually in the big name journals, but rather in the mid-tier or even lower-tier publications, where really radical thinking and unusual results find their way into the literature. The big name journals publish on popular topics that are already well along in the scientific literature. They’re important, but less interesting.
Curating out of the middle is a major opportunity for publishers and others in the information landscape. Repetition, presentation, prominence, and context all provide curatorial power.
The solutions to the information puzzles we’re being presented with may be as simple as Newmark is suggesting — maintain and increase trust, and curate information well, frequently, and meaningfully:
The successful news organization of the future will pursue models for news curation/selection which [are hybrids] of professional editing and collaboration among talented consumers.
The same could be said for journals and scholarly texts of all types. It’s really that simple.
Oh, and in the meantime, go mobile, social, and digital, all simultaneously.
9 Thoughts on "The Future of Publishing? Trust and Curation, Says the Founder of Craigslist"
I like what you are saying about needing both important and interesting elements in publishing. This is something we’re doing at http://f1000.com, identifying the science papers in the lesser-known journals that researchers need to know about.
The key to getting the message out is also, as you mention at the end, using social media and making the relevant sites available on smartphones so people can get the right information anywhere, anytime.
I think you need to increase your sample size. Of the many researchers I know, very few would say the most interesting science is not in the big journals (and those who do claim this generally reveal more about their own bias than anything else). The reason for this is that the top journals have large editorial staffs paid to seek out interesting work – essentially curation in advance.
What all would admit, however, is that there are occasional interesting articles that do not make it into the top journals – for a variety of reasons. This was the premise for Faculty of 1000. But, while this is a very nice idea in concept, one should be aware that many researchers remain unconvinced – amid rumors that many of the ‘faculty’ actually do rather little.
Note also that sections in journals devoted to ‘interesting papers elsewhere’ frequently point to the same papers in the same big 3/4 journals. This might just be a self-reinforcing hype cycle. Alternatively, it might be that the peak journals are pretty good at attracting the interesting stuff and there’s not a huge amount worth prospecting for elsewhere.
I can categorically state that there aren’t any scientists I know who would deliberately publish in a lower-tier journal rather than a top-tier journal because of a perception that the top-tier journals are less interesting. I can also make a pretty solid assumption that there aren’t any journal editors out there looking to lower the reputation of their journals in order to attract more interesting papers.
Well, that’s just getting it backwards. I never said that journals should purposely become mid-tier to be viewed as more interesting. I said a scientist I spoke with found the methods, hypotheses, and preliminary results in the mid- and lower-tier journals to be more interesting. That has no effect on the sociology of publishing status. In fact, this person was clearly browsing these tiers looking for something to convert into a breakthrough for a top-tier journal. But for inspiration and lateral thinking, I can see why you might want to be aware of some ideas there. For the purpose of curation, it seems like an opportunity — do you really think the top-tier journals publish all the interesting science? That’s a batting average I’d be envious of.
That’s quite a bit different from a blanket statement that “the most interesting science isn’t usually in the big name journals”. Sure, the bigger journals are looking for complete, meaningful results as opposed to preliminary experiments that might lead to something. And I’d argue that a big, groundbreaking paper that opens up new concepts and pathways is probably more likely to lead to interesting experiments for other researchers than a lesser, preliminary report (though perhaps the latter might provide a faster route to meaningful results if one is willing to poach someone else’s project).
I’d also argue that there’s no clear correlation between journal level and methods used, although in the lower-tier journals there’s often more room for description of the methods (one of the driving forces for so many top-tier publishers recently starting methods journals and adding large amounts of supplemental data and description).
As for hypotheses, I guess that’s in the eye of the beholder.
Curation is, and for the foreseeable future will be, the only way to make sense of the massive trove of information, and disinformation, on the Internet. At findingDulcinea, we own the URL “CuratorOfTheInternet,” because we have long seen the need for someone to play this role. Internet users want to view content from a broad array of sources, and they want someone to put it in context for them to help them make sense of it all. Search engines have made us stupid, by causing us to skim across the surface of the Web without ever fully understanding anything we read; curators are the antidote.