05 June 2014 | 08:32 AM America/Los_Angeles

Content Marketing: Don't Trust SEO — Publish Content For People And Not Robots

There's a huge quality mismatch between content created for SEO purposes and human-edited content. With the explosion of content marketing, it is vital to understand the difference.

Every company is a media company now, and every company wants content to publish. But how do they judge whether it's any good?

On the blog of Virante, an SEO marketing agency, Russ Jones provides a case study on how to determine the quality of three separate articles on the same subject, produced by writers at services such as WriterAccess, TextBroker, and ContentRunner: 


But herein lies the problem: exactly how good are these writers?


His answer is to use a tool that scores the articles on how relevant each will be to Google.

But herein lies the problem: what's good for Google rank is not the same as what people consider good copy. People are far more discerning than an algorithm, and you can see it in the three content examples in the case study above.

None of the three articles is any good, and the one with the highest SEO score is the worst of the lot.

If you are a website publisher, choosing to publish content edited for SEO purposes leaves you with a serious problem. SEO techniques might improve your traffic, but they can't improve the quality of your content.

What's the point of impressing Google if visitors bail when they arrive because the content is dull, has nothing original to say, and wastes their time?

The Googlebot will revisit, but human visitors won't.

Good content will create a far more lasting and favorable impression on visitors than trying to brown-nose Google's fickle and ever-changing algorithm.

Each time your content connects with people, you move higher in their internal "pagerank" assessment of your site. That's a good thing.

And if you do it again and again, you will rise higher in people's respect for your site, and they will be more likely to return. That is very high-quality traffic, and it is Google-free. That's gold.

Google traffic is low quality because it's the SEO, rather than good content, that draws visitors.

People are far smarter than Google. People quickly see bad content for what it is, regardless of how well it scores with search engines. 

Don't be impressed by the four-syllable word "algorithm."

The dirty little secret of algorithms is that they are simply lists of rules. Google's algorithm uses just 200 signals to understand the value of a web page.
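As a rough illustration (a toy sketch in Python with invented signal names and weights; it does not reflect Google's actual signals or their weighting), a rule-based ranking score is little more than a weighted checklist:

    # Toy illustration only: the signals and weights below are invented,
    # not Google's real algorithm.
    SIGNAL_WEIGHTS = {
        "keyword_in_title": 3.0,
        "inbound_links": 2.5,
        "fast_page_load": 1.0,
        "keyword_density_ok": 0.5,
    }

    def score_page(signals):
        """Score a page as a weighted sum of whichever signals it exhibits."""
        return sum(weight * signals.get(name, 0.0)
                   for name, weight in SIGNAL_WEIGHTS.items())

    # A page can tick every box on the checklist and still bore a human reader.
    print(score_page({"keyword_in_title": 1, "inbound_links": 12, "fast_page_load": 1}))

The point is that a checklist like this can be gamed; a human reader's judgment cannot.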

The algorithm has to be simple to index the massive amount of Internet content that's expanding at a supernova pace. And the SEO industry works very hard to dumb Google down further by continually discovering exploits.

Searching for purity...

Google hates the SEO industry so very, very much because it wants only pure signals of virality and popularity. It is constantly seeking merit in content, untainted by fake signals of popularity generated by hired marketers and others (look out, PR industry: you are on Google's list of enemies).

People use thousands of signals to evaluate the value of web content; there is no hope in hell that Google's 200-signal algorithm can come anywhere close to matching human intellect. [It might be able to mimic a nematode with its 302 neurons.]

Your takeaway: Google's searchbot is clever but not smart. It has very poor comprehension of what it reads and sees. Do not rely on its likes and dislikes for content marketing.

Publish content for people and not robots. Plus, people have money and robots have none.