There is an online market for so many intangible goods these days that it should come as no surprise that there is a market for Facebook “Likes” — the little thumbs-up rating that accompanies so many products and services we see on a daily basis.
For $75, a marketing company will sell you 1,000 Facebook “Likes,” according to NPR’s Planet Money. Only the marketing company does not supply the “likes” but works as a broker between real individuals who are willing to sell their online preferences to your product for very small sums of money — ten cents a “like” — and those who wish to artificially inflate their prestige.
Ten cents may not seem like a lot of money, but there is a huge workforce of individuals willing to be employed to undertake low-skilled, repetitive online work for pennies a task, as evidenced by mature markets like Amazon’s Mechanical Turk. Global outsourcing has never been easier.
Competing with this world market of prestige workers — and driving down prices even lower — are legions of software robots willing to do the work of humans. According to the NPR story, Facebook is retaliating with its own army of “artificial bot hunters” that attempt to root out these robot accounts and kick them off the network.
The artificial trust market is not new and is found in environments where online trust is important, such as purchasing an antique love seat from a complete stranger on eBay, finding a reputable bed and breakfast in rural Ireland, selecting a new e-book from Amazon, or choosing an app from the Apple Store. When in doubt, our tendency is to turn to the wisdom of the crowds because we believe that these ratings are accurate evaluations generated by honest individuals and based on real experiences.
Trust — or at least consensus — works the same way in scientific publication through the accumulation of citations, only the barriers to participate in this market are much, much higher. To cast your votes, you need to publish a paper that is indexed by Thomson Reuters’ Web of Science (or alternatively, Elsevier’s Scopus). Like Facebook, Thomson Reuters does not take kindly to citation manipulation and will delist a journal when it exhibits manipulative behavior such as systematic self-citation or, more recently, the formation of citation cartels.
With some new forms of alt-metrics, it is more difficult to detect when gaming is taking place. Measuring impact based on article downloads, for instance, requires one to trust that the publisher has properly processed proprietary log files and reported the statistics truthfully. Furthermore, as David Crotty expressed recently, downloads may be measuring something very different — popularity, rather than trust or prestige. It is not difficult to ask one’s closest colleagues, friends, and family to provide support for a newly published paper in the form of laudatory or perfunctory comments left on a journal site, or to tweet (or retweet) links to one’s own paper. Those wishing to count scientific impact based on tweets should be aware that you can purchase thousands of them for very little money. According to one advertisement, $5 will purchase a tweet to 150,000+ followers. The ad does not mention whether these followers are humans or robots.
While the reputation market has spawned companies dealing in the buying and selling of Facebook “likes,” we have not yet witnessed a similar market for citations. Arjen Wals, professor at Wageningen University in the Netherlands, imagines an eBay-like market (hBay or PleaseCiteMe.com) where authors would offer perfunctory citation services in their manuscripts for a fee, or at least for a reciprocal exchange. Luckily, this market doesn’t exist yet, although I cannot preclude the existence of an informal shadow market an engineering colleague of mine is convinced exists in China.
While we often hear that transparency (or “sunlight”) is the best solution for rooting out corruption in a trust market, it must be accompanied by accountability. Catching someone for attempting to game a trust market is only half of the solution. Finding an appropriate punishment that fits the deed is the more difficult part.
Trust requires both transparency and accountability. I’ll give that a thumbs-up.