Google’s tough new stance on duplicate and low-value content means businesses will have to rethink their SEO strategies, according to an SEO expert.
Google recently launched an algorithm change designed to tighten its duplicate content filters and more accurately identify the original source of duplicated material. As a result, searchers are more likely to see sites containing original content, as these will be ranked higher.
Adam Bunn, SEO director of search marketing and technology firm Greenlight, says the move will come as welcome news to many Google users.
“But as ever, when Google makes a relatively big change to its algorithm, there have been reports of collateral damage where the change has affected sites whose owners feel Google has mistakenly identified their site as having duplicate or low value content,” Bunn says.
Bunn says businesses that have recently lost rankings should identify the affected pages and then search Google for short snippets of text from those pages, enclosed in quotation marks.
If a page does not rank first, or does not rank at all, the problem is likely related to duplicate content.
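To illustrate Bunn’s diagnostic step, the short Python sketch below simply builds exact-phrase Google search URLs from a list of snippets so they can be checked by hand in a browser. The snippets shown are placeholders, and the results still need to be inspected manually; this is a minimal sketch, not a tool endorsed in the article.

```python
from urllib.parse import quote_plus

# Hypothetical snippets copied from pages that have lost rankings;
# replace these with short passages from your own affected pages.
snippets = [
    "searchers are more likely to see sites containing original content",
    "this guide explains how to configure the widget for left-handed users",
]

for snippet in snippets:
    # Wrapping the snippet in double quotes asks Google for an exact-phrase match.
    query = f'"{snippet}"'
    print("https://www.google.com/search?q=" + quote_plus(query))
```

Open each printed URL in a browser: if the page the snippet was taken from is not the top result, that is the warning sign Bunn describes.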
“If firms have noticed any significant changes in their online rankings… which do not recover within the next few days, that would suggest these are not normal or typical temporary ranking fluctuations,” Bunn says.
“It is possible their site has been affected by the update. This will be either because the site’s content is not entirely original, in which case it will need to take steps to correct this, [or] it could be the result of someone else having copied the site’s content and Google mistakenly treating that other site as the original source.”
“In this case, the site’s appeal options are limited, and it is often easier to change the content anyway.”
Bunn says in light of the change, businesses should ensure their pages:
- Have a sufficient amount of original text content, supported by images, videos and other multimedia as appropriate.
- Are rapidly indexed by the search engines. To achieve this, the site should be linked to regularly, which calls for some kind of link acquisition strategy, and new pages should be submitted to the engines via XML sitemaps (a minimal sitemap sketch follows this list) and featured on the homepage or another highly authoritative hub page on the site until they have been indexed.
If the site has a blog, make sure it pings the search engines when a new post is published, and then use the blog to publish or link to new content on the site.
- Are linked to and/or cited directly by third-party sites. Since it is rarely practical or economical to build links actively for every page on a site, businesses should regularly consider why someone would naturally link to the site’s pages or share them on Twitter, for example. If you cannot think of a good reason, you may need to go back to the drawing board.
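For the sitemap point above, the sketch below generates a minimal sitemap following the sitemaps.org protocol and prints a submission URL. The page URLs, the sitemap filename and location, and the ping endpoint (which Google offered at the time and has since retired in favour of submission through webmaster tools) are assumptions for illustration only.

```python
from urllib.parse import quote
from xml.sax.saxutils import escape

# Hypothetical list of newly published pages; replace with your own URLs.
new_pages = [
    "https://www.example.com/articles/original-research-piece",
    "https://www.example.com/articles/new-product-guide",
]

# Build a minimal sitemap following the sitemaps.org protocol.
entries = "\n".join(
    f"  <url>\n    <loc>{escape(url)}</loc>\n  </url>" for url in new_pages
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)

# Once the sitemap is uploaded to the site root, it can be submitted through
# the search engines' webmaster tools, or (at the time of writing) by pinging
# an endpoint such as the one printed below.
sitemap_url = "https://www.example.com/sitemap.xml"
print("https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe=""))
```

Keeping the sitemap current each time new content is published is one way to help the engines index pages quickly, which is the point Bunn is making.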