Editorial Policy and the Search Engines: Who Decides What is Spam?

Recently I was involved in a discussion with some internet colleagues about the relative merits of two sites for writers. I am not writing this editorial in order to run down or disparage either of those sites, even though they are in a sense my competitors. What I want to do is to contrast their respective editorial policies, as a way to explain my own philosophy about writing for the net and what I want PubWages to be.

So let’s just call them Site A and Site B. Site A allowed writers complete freedom to choose their topic, their style of writing, and the length and depth of their coverage. It allowed tiny snippets of poetry, articles that consisted almost entirely of images, and also very long and detailed treatises by experts. It allowed everything else in between. It was a writer’s paradise and also a marketer’s paradise, because there were very few rules and restrictions. Some people made a lot of money there. Some made no money at all. But most everybody was happy, because they got to do their own thing. Payment was in the form of revenue sharing. Who decided how much revenue you made? The market. And that means: the readers!

Site B was very strict. It told people exactly what they could write about, from a list of permissible topics. It had parameters for style and length. It determined exactly how deep or broad the coverage of the topic was going to be. And it paid up front for work done. So who decided how much money you would make? The management of Site B.

As a result of these competing editorial policies, Site B had a very consistent quality to it: it wasn’t very deep and it wasn’t very creative, but you could count on it to deliver a certain kind of writing product. Site A was like an old used book shop, where you never know what you will find. There was no consistency in topic. There was no consistency in style. But there were real gems there to be discovered, among all the heaps and heaps of junk.

As both a writer and a reader, I liked Site A better. But one day Google decided that there was too much “spam” and “garbage” on Site A, and they penalized it terribly, so that even the best articles did not show up in search. Site B, with its consistently maintained standards of mediocrity, on the other hand, was not penalized. As a result, I hear that many writers who used to like Site A better are reluctantly conforming to the restrictive editorial policies of Site B. They are also submitting to a pay scale that is not directly determined by the market.

Now here’s what I think about Google’s decision on the rankings of the two sites: creativity was stifled for the sake of uniformity. Management that stood as an intermediary between the reader and the writer was preferred over management that just took care of the technical end of things, allowing reader and writer to interact freely.

But here’s what one of my colleagues has to say about that: “No, Site A was not penalized for allowing broad topics and creativity. It was penalized for allowing the site to develop into a cesspit of garbage, which totally drowned out the efforts of those who provided good content.”

You know what? Those two things go hand in hand. Allowing for creativity is also what allows for garbage. It’s the laissez-faire attitude at work. If you start to require people to meet a certain formal standard, that requirement will also impinge on their ability to do their very best in their own way. This principle works the same everywhere it is applied, whether in a privately run site on the internet, or in the market as a whole. The moment products have to meet a certain standard set by an authority, rather than being tested by users, then many potentially wonderful new products are never made available, and all in the name of protecting the consumer.

The problem is that the search engines, and Google in particular, are no longer acting as mere search devices to let people know what is out there and to make their own decisions.  They are now trying to protect the consumer from spam.

What is spam? The last time I checked, it was a message received that was of no interest to the recipient. Whether something is spam is entirely in the mind of the individual who reads it. But articles on the web cannot be spam, because they are not messages sent to your email. They are more like books in a library. If you look up a book in the card catalog, thinking it will help you, and then find you don’t like it, do you blame the publisher of the book? Or do you just put it back on the shelf and look for another one on the same topic?

It’s not Google’s job to protect searchers from finding something they don’t like. It’s Google’s job as a search engine to let you know what’s available, given the search terms you used.

After Site A was hit hard by Google, they tried to change. They tried to make things more pleasing to the powers that be. They thought that maybe there hadn’t been enough rules in place. So they started making arbitrary rules about pixelated images and the ratio of copy to ads, thinking that would make things better. It didn’t.

I know that some of you are going to come back and say things to defend Google or Site A or Site B. So just to anticipate what you’re going to say, I’ll say it for you. Google is a private business, and it has a right to change its algorithm any way it likes. I agree. And we as consumers have the right to choose a different search engine, if we don’t like the results. Lately I’ve been using Bing, because they don’t seem to work as hard as Google at substituting what they think the average person would like to read for what I actually asked for.

Site A and Site B are private businesses, and they have every right to choose whatever editorial policies they want. Also true. But amazingly enough, many writers are defecting from both sites in order to join PubWages and other smaller, independent sites such as ThisisFreelance, Xobba, Freelancewriternetwork and Excerptz.

It’s good to have choices. It’s good that there is a free market. And it’s also useful to realize that any attempt, whether by a private entity such as an editorial board or a search engine, or by a public entity, such as a regulatory agency, to protect consumers from products denies access not just to the worst the market has to offer but also to the best.

© 2011 Aya Katz

About admin

I am a publisher, linguist, primatologist and writer. I am an editor at Inverted-A Press, a primatologist with Project Bow, and the administrator of PubWages.

14 Responses to Editorial Policy and the Search Engines: Who Decides What is Spam?

  1. Mark Ewbie says:

    An interesting debate on the merits of letting the market, Google, or publishing sites decide on that hard-to-define quality thing. I have read some short bits of rubbish that have made me smile and long worthy pieces that haven’t. Which is better?

    From my point of view all I would like Google to do, which they are not doing, is to get rid of the spinners and copiers.

    • admin says:

      Mark, thanks for your comment. It would certainly be nice if Google could help get rid of scrapers and such, but strictly speaking, even that is not their job. I’m afraid we’ve all gotten used to the idea that Google is some kind of cyber police force, and that idea can come back and bite us, if we don’t watch out.

  2. Sweetbearies says:

    I do not want Google to delete the scrapers, but if a scraper is using Google Adsense to promote their stolen content, then I hope they will at least remove my content from that site. Or simply not allow scrapers to advertise with Google. Now I have noticed that most of the Blogger blogs that copied my content were asked by Google to take it down, and they are still up. If Google is offering advertising to a scraper site, they become responsible for it.

  3. Mike says:

    Interesting points. There will always be those who would rather copy another writer’s work than do the work themselves, and there is nothing we can do about that without hurting everyone.
    The power is, and always has been, in what the market demands. Google, with all its insights as to what the market wants or needs, seems to be interested in market trends rather than good writing or writers’ points of view.
    Site “A” and Site “B” both have their merits, to be sure. I believe the market in the end will weed out the weeds…

    • admin says:

      Mike, thanks for your comment. Of course, the scrapers and the copiers are beyond contempt. And it’s true, as you said, that generalized measures meant to get at them can end up hurting everyone.
      Google has no way of telling what is good writing. It’s not their fault — artificial intelligence just hasn’t advanced that far. But, what _is_ Google’s fault, and I think we should call them on it, is pretending that their algorithm can tell what is or is not a quality site. No two people agree on that 100%. How can a computer program tell?
      Ultimately it is indeed up to the market…

  4. Pamela99 says:

    Thank you so much for this article, as I now understand more clearly what happened and how it can and will continue to dictate standards. I agree with Mark’s comment as well.

  5. Pingback: Article Directories: No Rules, Many Rules or Few Rules? | Excerptz

  6. Sweetbearies says:

    I think a good solution might be if Google, Yahoo, and all the major search engines could give us the option of only allowing our version of an article to appear in the search engines. For instance, if a scraper site attempts to copy more than, say, thirty words from an article, other copies besides the original would not show up in the search engines. The scraper site would not be shut down, but they would not be rewarded with people being able to view copied content in the search engines.

    • admin says:

      Sweetbearies, I agree that the original should have precedence over the copy, although that isn’t even a question of the quality of the content. It’s about the intellectual property rights of the person who wrote or owns the content.
      Interestingly enough, Google will penalize you even if you copy the same content onto two sites when you are the one who wrote it and have every right to do that from a legal standpoint.

      Now, from a purely logical perspective, if two sites have identical content, they should rank exactly the same, if we think Google is measuring the quality of the writing. But sadly it is often the scraper site that ranks higher! How this happens is a mystery.

      • Sweetbearies says:

        I noticed the scraper site that copied my pub is still doing really well. All it is is a conglomeration of scraped material, and they admit as much in the DMCA counterclaim. I find it interesting that Google does not pick up on this. I pointed out to them that if the person is not even writing a counterclaim in English, there is a good chance all their material is copied off other sites. I agree, no search engine can rank quality. It is a farce if anyone thinks it could.

        • admin says:

          I think those people who copied your article are not even trying to claim authorship. They probably are arguing that they had permission to copy it. You would think that it would be enough for you, the author, to assert that you gave no permission! But if they are ranking higher on the same content, this just goes to show that the Google algorithm is not content-based. As a webmaster, I would like to know what they did to rank higher! That would be useful information.

  7. Pingback: Article Directories: No Rules, Many Rules or Few Rules? | PubWages
