How Google’s Penguin Update Has Influenced Content
As part of Google’s continued effort to eliminate webspam from the Search Engine Results Pages (SERPs), the company has focused on promoting sites with high quality content and penalizing sites that offer a poorer user experience. Much of that focus has been on the quality of the links that sites use. Several of the less reputable article directories no longer contribute to SEO, but the normal, organic links that people create when writing a blog continue to have a positive effect on page ranking. The real thrust of the Penguin update to Google’s search algorithm is the elimination of sites with thin, low quality content, and this has begun to influence the way webmasters think about what they post on their sites.
In the article that announced the Penguin update [1], Matt Cutts posted an example of the kind of content Google classes as “webspam”: a keyword-stuffed page of barely legible copy designed to hold as many links as possible. He also indicated that Google’s published quality guidelines [2] are the best guide to the sort of content the search engine will favor. Those guidelines suggest that a reasonably well-written article offering an experienced viewpoint on a subject will earn fair placement in the SERPs. Because this is the sort of content most likely to be linked to as a reference by other websites, or shared on social media, a natural pattern of links, both inbound and outbound, is also expected, as are relevant internal links.
Keyword use has been a contentious issue for over a year, as webmasters tried to find the balance that would deliver good SEO without looking keyword-stuffed to the crawlers. The updated algorithm now rewards a variety of related or synonymous keywords, as well as the long-tail keyword phrases associated with the subject of the content. This is part of a general move toward semantic search by all of the major search engines, making search more flexible and better at interpreting users’ intentions.
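To make the idea of keyword balance concrete, here is a minimal Python sketch of the kind of sanity check a writer might run on a draft before publishing. The keyword_report helper and any sense of a “safe” density are illustrative assumptions for this article only; Google does not publish a density formula, and this is not its algorithm.

```python
# Hypothetical sketch: report how often each target term appears in a
# draft and what fraction of the total word count it represents.
# The helper and any density thresholds are assumptions, not a Google rule.
import re
from collections import Counter

def keyword_report(text: str, target_terms: list[str]) -> dict:
    words = re.findall(r"[a-z']+", text.lower())   # crude word tokenizer
    total = len(words)
    counts = Counter(words)
    report = {}
    for term in target_terms:
        occurrences = counts.get(term.lower(), 0)
        density = occurrences / total if total else 0.0
        report[term] = {"count": occurrences, "density": round(density, 4)}
    return report

draft = ("Penguin rewards varied, natural writing. Synonyms and related "
         "phrases read better than the same keyword repeated over and over.")
print(keyword_report(draft, ["keyword", "penguin", "synonyms"]))
```

A report like this only flags obvious repetition; the broader point of the update is that varied, natural phrasing serves readers and semantic search better than hitting any particular number.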
All of this is aimed at webspammers in an effort to reduce the ways black-hat SEO can manipulate the SERPs, and the sites hurt most are those that collected spurious links and those with thin content. Good webmasters continually post valuable content and build organic networks of links and affiliate arrangements, and these updates only make their jobs easier by removing many low quality sites from high ranking positions in the SERPs. For every page that loses position, another, probably better, page improves its position, leading to a better overall experience for search users.
This has focused more of internet marketers’ attention on the kind of content they use. The days of posting a few hundred words of sales copy with some links to product pages and buying SEO in the form of cheap backlinks are long gone; the new zeitgeist of internet marketing is high quality content that builds an authoritative online presence. Building an archive of quality content is a long-term marketing strategy that can secure a website’s longevity at the top of the search results. For many businesses, creating a constant supply of good content for their website can be a challenge. InfintechDesigns.com is an excellent source of help and advice on effective search engine marketing content and strategies.
References:
[1] http://googlewebmastercentral.blogspot.co.nz/2012/04/another-step-to-reward-high-quality.html
[2] http://googlewebmastercentral.blogspot.com.au/2011/05/more-guidance-on-building-high-quality.html