With its ongoing Panda algorithm updates, Google has become more effective at finding sites with thin, low-quality content and penalizing them. This is especially true of minimal websites where pictures abound but there is barely any text to be found. Unfortunately, websites that offer users little content above the fold are also lumped in with minimalist sites by almost every search engine. Websites whose content is plagiarized from elsewhere on the web face continuous penalties, and may even end up banned by Google outright. So how do you avoid these mistakes and keep your site safe from Google's algorithms? We've put together some simple but important tips that can help you determine the future of your site.
Use the Robots.txt File Carefully
The robots.txt Disallow technique works well, but it must be used carefully, as a mistake can remove the entire website from the index. For this reason, only the pages that contain duplicate content should be disallowed. Anyone who would like more detail on robots.txt files can find excellent resources covering the topic.
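As a minimal sketch, a robots.txt like the one below blocks crawlers only from sections that duplicate content elsewhere on the site; the /tag/ and /search/ paths here are just placeholders, so substitute the sections that actually hold duplicate content on your own site:

    User-agent: *
    # Block only the sections that duplicate content found elsewhere on the site
    Disallow: /tag/
    Disallow: /search/

Note that a stray "Disallow: /" on its own would block the whole site, which is exactly the kind of accident this technique is prone to.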
Take Care of Keyword Density & External Links
Keyword density is now very important for staying on the safe side of Penguin: keep keyword density at a maximum of 3% in your articles. That means if you write a 500-word article, use the main keyword no more than 15 times, and use synonyms elsewhere. Also take care of the links going outside your website; those links should always carry a rel="nofollow" (or rel="external nofollow") attribute, so that Google Penguin can't affect you.
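For illustration, this is what such an outbound link looks like in HTML; the URL and anchor text below are placeholders, not references from this post:

    <!-- External link that passes no ranking credit to the target -->
    <a href="http://example.com/some-page" rel="nofollow">Example resource</a>
    <!-- Some themes combine both values -->
    <a href="http://example.com/other-page" rel="external nofollow">Another resource</a>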
Work More for the Users
If you always keep the user in mind instead of the search engines, then I'd say you are doing just fine, because when users like your content, comment on it and share it on social media, the search engines will automatically rank your content in a better position in the SERPs. Keep your SEO as simple as possible.
Re-submit Your .xml Sitemap
If you have found low-quality content on your site and removed it, make sure you re-submit your sitemap to Google Webmaster Tools, so that Google can re-index your full site without the removed posts/pages. That way you can be sure you are on the safe side.
In general, always take a close look at your website's posts; if you find low-quality posts, removing them from your site as well as from Google Webmaster Tools is good practice.
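For reference, the sitemap you re-submit is just an XML file listing the URLs you still want indexed, so the removed low-quality pages simply no longer appear in it. A minimal sketch, with a placeholder domain, paths and dates:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Only pages you want indexed are listed; deleted posts are left out -->
      <url>
        <loc>http://example.com/</loc>
        <lastmod>2013-06-01</lastmod>
      </url>
      <url>
        <loc>http://example.com/quality-post/</loc>
        <lastmod>2013-05-20</lastmod>
      </url>
    </urlset>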
‘Nofollow’ Internal Links
There is a well-known approach that ties together the effect of all the approaches above. When the 'nofollow' attribute is added to links on pages that should be indexed but which point to pages that should not be indexed, that gives every search engine better information. They understand that the duplicate content on the non-indexed pages is published there for the benefit of users, and not for profit to be gained from search rankings.
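In practice this is the same rel attribute shown earlier, just applied to an internal link; a small sketch with a hypothetical archive path:

    <!-- On an indexed page, pointing to an archive page we don't want indexed -->
    <a href="/archive/2013/05/" rel="nofollow">May 2013 archive</a>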
Use Noindex/Nofollow in Meta Tags
This is the simplest and easiest approach to implement, and it is normally used on pages like blog categories, archives or tag pages, where the content's only purpose is to help readers find the pages they are looking for. Adding noindex and nofollow to the meta robots tag keeps these pages out of the index and tells search engines not to follow their links, which gets you the wanted result.
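In HTML terms, this comes down to a single tag placed in the <head> of each category, tag or archive page:

    <head>
      <!-- Keep this page out of the index and don't follow its links -->
      <meta name="robots" content="noindex,nofollow">
    </head>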
If you are on WordPress, you can use the "Robots Meta" plugin to noindex and nofollow all of your categories, tags, archives and other duplicate links; and if you are on Blogger, just make sure you follow this post: Blogger SEO Tips