Crowdsourcing translation services — what an intriguing, even shocking concept! Subscribing to the philosophy of grassroots movements such as open source software, Wikipedia and even popular revolts like the Arab Spring, crowdsourcing calls the value of traditional expert translation into question.
The last time this happened, machine translation (MT) made an aggressive play for market share in an industry dominated by human translation (HT). In the bruising contest that ensued, new niches were carved out for the respective combatants, but it was a third party, computer-aided translation (CAT), that emerged strongest.
CAT is the true post-human face of modern translation, combining efficient computing techniques with human quality and process to deliver infinitely scalable, top-quality, streamlined translations that MT and HT cannot compete with.
The madding crowd
Now, sweeping change is upon us again. Venture capitalists are apparently falling over themselves to fund platforms or services based on community translations. And it’s easy to see why. Community translation purports to be cheap or free, quick and to deliver results.
Amara, a crowdsourced subtitle translation service for YouTube, received a US$1-million grant from, among others, Mozilla, and has shown it can translate videos into 20 languages in 24 hours.
Smartling, a technology platform for recruiting community translators, has attracted US$24-million in VC funding. It is also the translation platform for non-profit TED of TED Talks fame.
If amateurs can produce original content, and this can be ‘perfected’ over time by a process of peer review, are CAT techniques still relevant? Let’s examine the lessons to be learnt or inferred from crowdsourcing efforts.
When Macromedia launched Dreamweaver in the US in 1997, it was blown away to receive a fully-translated version of the user interface from a French fan of its products a year later.
But think twice before leaping into China without a localised product, collateral or website. It stands to reason that a new product has more excitement value than further updates, and any crowdsourcing strategy would have to incorporate a plan B down the line.
Crowdsourcing has also worked well for Mozilla, whose Firefox browser is offered in 70 fully localised languages. Like Dreamweaver's fans, community translators of Mozilla Firefox are dyed-in-the-wool tech enthusiasts who know browser terms in their own language and can invent appropriate words for newly invented concepts.
Commercial vs charitable
But unlike Dreamweaver's, Mozilla's objectives are not commercial but developmental, inspiring more devotion as a consequence.
Firefox's French community, it transpired, is just as passionate as Adobe's, routinely responding to new versions with swift translations. Even when you're changing the world, some communities are bigger enthusiasts than others, and crowdsourcing as a community engagement tool has its productive limits.
Horses for courses
As both cases show, enthusiasts can in certain cases offer greater technical accuracy than professional translators. But this disregards the fact that professional translators do a great deal of research (within reasonable limits), and that power users don't necessarily make good translators.
Translation, properly approached to ensure quality and cost-saving efficiency, is not a middling undertaking, and regardless of the claims, does not scale. Soliciting translations requires infrastructure — namely, a self-contained translation kit that allows the owner of the original text to receive translations back and debug them for coding or formatting errors.
In addition, it requires a project management competency that not even big companies care to maintain. Compare this to supporting an in-house marketing competency (in-sourcing) or project-managing a crowd of suppliers delivering various components of a marketing solution to augment in-house skills. Because you’re paying crowdsourcing rates (little or nothing), your suppliers are unlikely to take on very much work, which swells the supplier base and adds to the management burden.
Nor are they likely to be very good. Poor quality translations place a high burden on peer reviewers, which is a passion killer (other than perhaps in religious communities such as core Linux contributors). In a well-managed process (outsourcing), peer review would happen much earlier. So, considering the vendor and outcomes management overhead of the crowd, why do it?
On closer examination, the supposed benefits of crowdsourcing – quality, low cost and speed of results – can be illusory.
Many cases exist in which, to a greater or lesser degree, communities should work in tandem with language service providers (LSPs) to provide quality, scale assurance and project management.
Like each new game-changing event on the tech landscape, crowdsourcing resets the market, but always less dramatically than first thought.
Memeburn focuses on everything digital in the emerging markets sphere.