Since the first Panda update (also called the Farmer update) was rolled out by Google on 23rd February, 2011, many small and large websites/blogs have been hit badly. While some of those sites managed to survive in this post-Panda age (Google pushes refreshes of this algorithm frequently), a majority of the victims are still struggling to get back to the natural peak they held before being hit by several Google updates. The main goal of these updates was to eliminate spam content from Google's Search Index and to downgrade sites carrying too much low-quality content.
But there are many examples of medium and large websites/blogs that have been wrongly hit by the Panda algorithm even though they serve plenty of quality content. So there are cases where Google's algorithm fails to differentiate between quality content and spam. In late 2012, my blog, which had a decent number of visitors coming from the Google SERPs, was also hit by the Panda algorithm. Since then, I had been trying to recover. After more than 26 months (2 years and 2 months), my site slowly started recovering from Panda (my guess). Here is how I did it.
It is hard to recover from Google Panda because you cannot be sure what the actual problem is. Duplicate content, thin content, content with poor language and grammatical errors: the list goes on. You might already be familiar with these terms, and I guess you know how to deal with them. If not, search the web and you will find several blogs discussing those issues. Here is an interesting article.
Apart from the content itself, there are also some technical issues for which your site can be hit by the Google Panda algorithm. What I discovered are some internal flaws in WordPress that seem to play a role in Google penalties. Though WordPress has been updated continuously since its birth, it is still not completely flawless. One such flaw is the 404 Soft Error, which I will discuss later in this article.
Another problem with WordPress is that it creates duplicate content within your own site if you do not configure it properly. Below, I have discussed the points one by one for setting up WordPress to recover from a Google penalty, or to keep your site safe from future algorithm updates.
1. Fix 404 Soft Error in WordPress
The first thing you would like to ask is, “What is a 404 Soft Error?” Well, 404 soft errors occur in WordPress when a page of your site has no content but does not return a proper 404 Not Found server status. Suppose you're searching within your blog using the WordPress internal search option. Generally, the search term is added as a parameter to your site's URL (http://yourdomain.com/?s=searchparameter) and the results are shown on the http://yourdomain.com/search/searchparameter/ page.
If you search for something that does not exist on the site, it will show a page with an empty result saying “The content you're looking for does not exist” or something like that. This is where the 404 soft error occurs: the search result page visually displays a “Not Found” message, but the server does not return a proper 404 Not Found status for it.
Okay! But how can I say that this will invite a Google penalty? Google explicitly mentions in their Webmaster Guidelines that they discourage soft 404 errors on your website. Here is the link.
How to Fix it?
You can fix this problem by adding some code to your WordPress template files. Here is the detailed tutorial on how to fix 404 soft errors in WordPress.
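As a rough sketch (the linked tutorial has the full details), the idea is to send a real 404 status from your theme's search template when the result set is empty. The snippet below uses the standard WordPress functions is_search(), have_posts() and status_header(); the file name search.php is an assumption — adjust it for your own theme.

```php
<?php
// Near the top of your theme's search.php, before any output:
// if this is a search results page with no matching posts,
// tell browsers and crawlers it is a real 404, not a "soft" one.
if ( is_search() && ! have_posts() ) {
	status_header( 404 );  // sends "HTTP/1.1 404 Not Found"
	nocache_headers();     // discourage caching of the error page
}
?>
```

After this runs, the empty search result page still renders as before, but the server status matches what the visitor sees.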
2. Avoid Blocking Resources with robots.txt
There was a time when Google used to suggest blocking the coding parts of your site (PHP, CSS or JavaScript files) using the robots.txt file, as they are almost the same across all WordPress sites and are not worth indexing. But now Google is smarter: it can easily differentiate between the actual content and the code of your site, and it won't index your code files. Notably, Google now asks you not to block these resources, as the crawlers need them to properly understand your site's structure (source). If you block your resources, you are actually preventing the Google bots from crawling your site properly.
A standard robots.txt for a WordPress site should look like this:
User-agent: *
Allow: /
Disallow: /wp-admin/
Sitemap: http://www.yoursite.com/sitemap.xml
Never block your theme files, CSS or JavaScript files.
3. Make your WordPress Site Mobile-Friendly
Mobile devices are the future of the Internet and of websites. With this in mind, Google is now concentrating more on mobile-friendly sites and uses mobile-friendliness as a ranking factor. Therefore, make your WordPress theme responsive. You can check whether your theme is mobile-friendly from here.
Here the robots.txt file comes into the discussion again. If you block your theme files, Google won't be able to determine the mobile-friendliness of your site.
4. Do Not Add NoIndex Meta Tag on Important Pages
I've seen many websites that add a noindex meta tag to their contact page, about page, privacy policy page and so on. This is a bad idea, as Google counts those pages as a crucial part of a website, and no-indexing them hurts your search visibility.
5. Don’t Hide the Date from Your Posts
Some SEO experts 🙂 suggest removing the date from posts to make them evergreen. This approach is misguided. Google indexes your posts with proper timestamps in its own archives; it simply does not always show them publicly. Hiding the date from your pages does not fool Google; you are only fooling yourself. You can't play games with Google.
If you want your posts to look evergreen on the search result pages, display the last-updated date on your posts instead.
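For example, inside the Loop of a theme template you could print the modified date only when the post has actually been edited after publication. This is a minimal sketch using the standard get_the_date() and get_the_modified_date() template functions:

```php
<?php
// Inside the Loop: show "Last updated" when the post was
// edited after it was published; otherwise show the publish date.
if ( get_the_modified_date( 'U' ) > get_the_date( 'U' ) ) {
	echo 'Last updated on ' . esc_html( get_the_modified_date() );
} else {
	echo 'Published on ' . esc_html( get_the_date() );
}
?>
```

The 'U' format returns a Unix timestamp, which makes the two dates easy to compare.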
6. Improve Site Performance
Site performance is another factor in Google's search ranking. Make sure your site loads fast. Improving the page load time of a WordPress site is a challenging task, though. Here are some points to concentrate on to improve your page load time.
- Choose a paid host for your blog instead of a free one. Free hosts can't deliver the quality that paid services offer.
- Host the images of your WordPress site on a sub-domain. This lets browsers download more resources in parallel and thus makes your site load faster. Using a CDN to serve your images is an even better practice. We suggest MaxCDN as it is cheap and reliable. Here is how to configure MaxCDN with WordPress.
- Defer loading of JavaScript files.
- Load JavaScript asynchronously.
- Use a caching plugin like WP Super Cache, WP Rocket or W3 Total Cache.
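Deferring a script can be done from your theme's functions.php without any plugin. One common approach filters the generated script tag; this is a sketch, and the handle my-script is a placeholder for whatever script your theme actually enqueues:

```php
<?php
// In functions.php: add a defer attribute to one specific script.
// 'my-script' is a placeholder handle for illustration.
add_filter( 'script_loader_tag', function ( $tag, $handle ) {
	if ( 'my-script' === $handle ) {
		$tag = str_replace( ' src=', ' defer src=', $tag );
	}
	return $tag;
}, 10, 2 );
?>
```

Newer WordPress versions (6.3+) also let you pass 'strategy' => 'defer' (or 'async') in the arguments array of wp_enqueue_script(), which achieves the same thing more cleanly.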
Bottom line: if you configure WordPress properly and write quality content, I am pretty sure you will be able to recover from any Google Panda effect. Though it may harm your site initially, the Panda algorithm can actually help you fight against scrapers. So keep up the hard work and keep building your WordPress site.