Phil Sim

Web, media, PR and… footy

What Google can do to stop plagiarism and save journalism

When people lament the decay of journalism in the online age, you’ll almost certainly hear Google named as a primary contributor. Critics cite the echo-chamber effect of Google News and the need for journalists to spit out copy at a million miles an hour to keep up in the Google age as factors driving the craft of journalism into the ground.

Plagiarism also often rears its ugly head, and the problem is getting worse. Yesterday, on our MediaConnect/ITJourno site, we wrote about how SMH.com.au journalist Asher Moses complained about a piece of his being pretty much pilfered by the Daily Mail. A piece about a local newspaper in the US made up almost entirely of plagiarised content ran prominently on online magazine Slate, along with this post on the Publish2 blog. Now, plagiarism pre-dates Google by centuries, but it’s probably fair to suggest that the rise of Google-driven online media models has intensified the problem by an order of magnitude.

Google recently moved to lower the ranking it gives to duplicated content, so if you use wire copy it’s less likely to show up prominently in Google News and on search engines. Surely that would act as a disincentive to plagiarise, you might say? No, it just forces unethical outlets to go to a little more effort to rework the plagiarised content so it’s not identical to the original. The Daily Mail story was a perfect example of a story that had been reworked, but the fact remains that it took slabs of content without any attribution.
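To see why light reworking is enough to dodge a duplicate-content check, here’s a minimal sketch, in Python, of comparing two articles by their overlapping word n-grams (“shingles”). This is an illustration of the general technique, not anything Google has published about its own algorithm; the sample sentences are invented for the example.

```python
# Sketch of shingle-based near-duplicate detection: a verbatim copy
# scores a perfect match, while a lightly reworked copy scores far lower
# and so can slip past a threshold tuned to catch exact duplicates.

def shingles(text: str, n: int = 3) -> set:
    """Return the set of n-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str, n: int = 3) -> float:
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "the minister announced a sweeping review of media ownership laws today"
verbatim = "the minister announced a sweeping review of media ownership laws today"
reworked = "a sweeping review of media ownership laws was announced by the minister"

print(jaccard(original, verbatim))  # identical copy scores 1.0
print(jaccard(original, reworked))  # reworked copy scores well under 0.5
```

The point of the sketch: swapping a sentence into the passive voice preserves the pilfered substance while destroying most of the shingle overlap, which is exactly the loophole the argument above describes.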

Google has become such a powerful force that it heavily influences, some might say dictates, the directions and strategies of most online media companies. If Google changes its ranking algorithms, SEO gurus all over the world scramble to react.

So what would happen if Google used its power to remedy some of the issues its rise has created, especially in regards to plagiarism?

What if an outlet caught plagiarising content was immediately removed from the Google News index? And/or had its PageRank trashed?

What if Google set a benchmark for what it considered fair use and asked all those outlets who wished to be included in Google News to abide by those guidelines?

You’d solve the plagiarism problem almost overnight.

For many sites, being thrown out of Google News would instantly cripple them. No one would risk running plagiarised copy. In fact, editors might even get vigilant about checking that contributors’ pieces haven’t been overtly ‘inspired’ by the work of others. If getting busted for plagiarism potentially threatens your financial viability, what outlet wouldn’t attempt to put checks and balances in place? Who knows, sub-editors might even become employable again!

Google should demand that any re-used copy or quotes come with attribution to the original source, and that re-users adhere to whatever re-use guidelines are published clearly and obviously on the original article.

Some will say Google is but a search engine and it’s not its charter to decide what is and what isn’t plagiarism. But Google News already has rules around it. To be indexed you need to be a multi-author content site with easily attainable contact information, and so forth. It’s a human-managed process that would only need to be expanded to take copyright guidelines into account.

Google also has to make policies and decisions relating to black-hat linking and SEO. If sites are caught buying links, they can have their PageRank reset, so again, it’s not a big leap to do the same regarding what is essentially black-hat publishing.

And even if Google didn’t want to take on that responsibility itself, it could instead work with an industry body, established by large media companies and publishers, that would define guidelines and rule on when they have been broken. I would suggest those guidelines put the onus on publishers to state their accepted re-use policy clearly; those who choose to re-use copy simply need to comply with it. If a site were found not to have done so, the body would inform Google of the breach, and Google and any other news aggregators who opted in to the system would then remove the infringing site from their indexes and/or downgrade its overall site ranking.

Legal solutions to this problem won’t work: the economics don’t add up. But that shouldn’t matter. We operate in a link economy, and Google is already the overlord of the link, so this really would be just a natural extension of its responsibility to the Internet community and its media partners. And I guarantee it would put an end to commercial plagiarism almost immediately.

If you think this idea could float, I encourage you to spread the word. There are movements like Data Portability that the web community has managed to champion, but the media community seldom seems able to come together to do anything, even though plagiarism is one massive problem that threatens everyone’s livelihood and, as described above, is I believe relatively easily solved. Even if the media industry doesn’t go as far as forming an official industry group, a group like dataportability.org could at least be formed to start engaging with companies like Google to work towards a solution.

Filed under: Plagiarism
