We delayed this post because we were experimenting to find out what actually caused the global drop in traffic, and why Google isn't responding to the biggest traffic tsunami ever. I remember that from July 17th to August 17th, Matt Cutts was on a 30-day self-challenge of not reading news or using social media. I guess that challenge has already ended, but is Matt Cutts even aware of the recent Google outage that took place on August 16th, while he was on this "No Social Media, No News" challenge? :> We could not find a single piece of information on what exactly happened on August 16th, and were therefore forced to do our own experimentation, which has at least improved our traffic by 10%; we are waiting for further, more productive results. The tips below are optimization techniques for both the Blogger and WordPress platforms, and applying them is quick and easy.
If you have not read our case study on the Google outage, then please read that first:
What might have affected your site traffic this time?
All of Google's services went down for about 5 minutes on August 16th, which caused a worldwide traffic loss of up to 50%-90%. We wrote a comprehensive case study to show that Google actually ran into trouble on two specific days, and the company has not yet commented on this technical blackout.
A similar failure happened on April 17th, 2013, when Google Apps, especially Gmail, went down for over 30 minutes and people could not log into the world's largest cloud-based emailing system. Google quickly commented on that failure by informing Mashable:
We are currently investigating issues with Gmail, Drive and Docs as well as the Google Apps Control Panel. We are providing more information and frequent updates through our Google Apps Status Dashboard, which you can find here.
But the problem this time is different. This outage affected Google's organic search traffic: well-optimized sites saw a massive drop of 50-90%. The forums are crowded with angry users waiting for an official reply, but it looks like this time we need to save the sinking ship ourselves.
Here are some of the problems, and their solutions, that might have resulted from this latest Google outage (or new Google penalty) story:
Problem #1: Loss of precious indexed entries on Google's servers
Google's databases that store indexed data might have been badly affected, causing the loss of several entries. Google's bots gather newly updated content using sitemaps, which are XML documents. Whenever you update your blog, your sitemap gets a new tag entry; this entry pings Google through your Webmaster Tools account, and that is how a search robot knows when to pay a visit to crawl and index the newly published content.
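If you are curious what that ping looks like, you can even send one yourself using the sitemap ping endpoint Google documented at the time. A minimal Python sketch, where the sitemap URL is a placeholder you should replace with your own:

```python
# Minimal sketch: notify Google that a sitemap has fresh entries.
# The sitemap URL below is a placeholder; use your own blog's sitemap.
import urllib.parse
import urllib.request

SITEMAP_URL = "http://www.example-blog.com/sitemap.xml"  # placeholder

ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(SITEMAP_URL, safe="")
with urllib.request.urlopen(ping_url) as response:
    # A 200 status only means Google received the ping, not that
    # the sitemap has already been crawled or accepted.
    print("Ping status:", response.status)
```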
Solution #1:
If this assumption is even partly right, then a sensible precaution would be to refresh the sitemaps you have submitted to Google. Follow these steps:
- Log into your Webmaster Tools account
- Go to Sitemaps
- Check all the boxes and hit the Resubmit button. Give Google at least 24 hours to refresh the entries.
Tip for Blogger: Submit sitemaps containing at most 500 URLs each. You can use our Multiple Sitemap Submission tool to create the full set of sitemaps for your Blogger blog (a sketch of how these sitemap URLs look follows the WordPress tip below).
Tip for WordPress: In your XML sitemap plugin, limit the number of posts per sitemap to at most 500 or 1,000, and add all of these sitemaps to your Webmaster Tools account.
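For the Blogger tip, the common workaround of that era was to use the blog's atom feed as a series of 500-URL sitemaps. Here is a rough Python sketch of how those feed URLs are built; the blog address and post count are placeholders, and the exact output of our tool may differ slightly:

```python
# Sketch: build the list of 500-URL sitemap feeds for a Blogger blog,
# using the atom.xml feed parameters the multiple-sitemap trick relies on.
BLOG = "http://www.example-blog.blogspot.com"  # placeholder blog address
TOTAL_POSTS = 1800                             # placeholder post count
CHUNK = 500                                    # at most 500 URLs per sitemap

sitemaps = [
    f"{BLOG}/atom.xml?redirect=false&start-index={start}&max-results={CHUNK}"
    for start in range(1, TOTAL_POSTS + 1, CHUNK)
]
for url in sitemaps:
    print(url)  # submit each of these under Webmaster Tools > Sitemaps
```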
Problem #2: Poor Quality Content [True Definition]
A striking question we often hear is about poor quality content: how do you identify that you have written something that is poor in quality? The answer is simple: any post or page on your site that is of no value to Google users. Now, what exactly could this content be? The following types of pages come under this category:
- Tags
- Labels, Categories
- Archives
- Author Pages
- Paginated Pages
- Duplicate Content
- Content with only links
- Content with a low word count (fewer than 300 words)
- Broken Links
- Affiliate Links & Affiliate Posts!
I have taken care of almost all of these types of SEO mistakes, but the last one is the one I suspect this time: repeated posting about affiliate products. We often post about discount coupons with repeated use of keywords and product details. This offers no value to Google users, only to our readers. After brainstorming thoroughly, we have come to the decision that such posts deserve to be tagged noindex and noarchive. Google rarely sends traffic to pages that share affiliate products, so it is wise to play it safe and noindex all such posts: they offer no long-term value to our organic search and are meant to serve blog readers alone.
Solution #2:
We have written several tutorials on how to solve the first nine poor quality content issues, and today we will share an extremely easy and rarely applied method to make the best use of your affiliate posts while still staying on the safe side. We will talk about poor content issue #10, i.e. Affiliate Links & Affiliate Posts!
It's pretty obvious that you must always add a nofollow attribute to all of your affiliate links, but what about the post itself?
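If you have many old posts, adding that attribute by hand is tedious. Here is a small, hypothetical Python sketch (using BeautifulSoup) of how you might nofollow the affiliate links in a post's HTML; the affiliate domain is a placeholder for however you recognise your own affiliate URLs:

```python
# Sketch: add rel="nofollow" to every affiliate link in a post's HTML.
from bs4 import BeautifulSoup  # pip install beautifulsoup4

post_html = '<p>Grab the deal <a href="http://affiliate.example.com/deal">here</a>.</p>'

soup = BeautifulSoup(post_html, "html.parser")
for link in soup.find_all("a", href=True):
    if "affiliate.example.com" in link["href"]:  # placeholder test for affiliate URLs
        link["rel"] = "nofollow"                 # tell robots not to follow this link

print(soup)  # the affiliate anchor now carries rel="nofollow"
```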
Noindex Affiliate Posts, Archives and Search Pages:
For Blogger:
Follow these steps for BlogSpot blogs:
- Go to Blogger > Settings > Search Preferences
- Enable Custom Robots header tags and set these settings:
Note: You can now easily noindex archive and search pages (labels, search results) by choosing the noindex and noarchive options for them in these settings.
Click Save. Doing this will add a new option called "Custom Robots Tags" to your Blogger post editor. This option lets you decide how robots should treat a particular post: whether you want it indexed or not. It has other options too, such as preventing translation of the post, but here we are concerned only with the noindex and noarchive tags, which tell robots neither to index the page nor to keep a cached copy of it.
Find all the posts where you have shared discount coupons, or where you think the post offers no quality and deserves to be tagged noindex. Simply select the noindex and noarchive tags for that particular post and hit Update.
Note: This method doesn't add a meta tag inside the opening head tag of that particular page; instead, it adds an X-Robots-Tag header to the HTTP response of that page.
Check the X-Robots-Tag for my homepage:
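You can verify the header yourself with a short script. A minimal Python sketch, where the blog URL is a placeholder:

```python
# Sketch: inspect the X-Robots-Tag header that Blogger's Custom Robots
# Tags option adds to a page's HTTP response. The URL is a placeholder.
import urllib.request

req = urllib.request.Request("http://www.example-blog.blogspot.com/", method="HEAD")
with urllib.request.urlopen(req) as response:
    # Prints e.g. "noindex, noarchive" for a noindexed post, or None
    # if the page carries no X-Robots-Tag header at all.
    print("X-Robots-Tag:", response.headers.get("X-Robots-Tag"))
```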
For further details, please read:
For WordPress:
Simply install the WordPress SEO by Yoast plugin and open your post editor. You will find the noindex option just under the Advanced tab.
What then?
Once you have noindexed all the poor quality pages and posts on your site, it is time to refresh the sitemaps by resubmitting them to Google. This will force Google to recheck the URLs and automatically remove the noindexed posts from its entries. That simple!
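After resubmitting, it is worth spot-checking a few of the pages you noindexed to be sure the directive is actually being served. Here is a rough Python sketch that looks for noindex either in the X-Robots-Tag header (Blogger) or in a robots meta tag in the HTML (WordPress SEO by Yoast); all URLs are placeholders:

```python
# Sketch: crude audit of noindexed pages. For each URL, check the HTTP
# header (Blogger's method) and the page HTML (Yoast's meta tag method).
import urllib.request

URLS = [  # placeholders: the posts and archive pages you noindexed
    "http://www.example-blog.com/old-coupon-post/",
    "http://www.example-blog.com/tag/coupons/",
]

for url in URLS:
    with urllib.request.urlopen(url) as response:
        header = response.headers.get("X-Robots-Tag", "")
        body = response.read().decode("utf-8", errors="ignore").lower()
        # Crude test: a robots meta tag plus the word "noindex" in the HTML.
        has_meta = 'name="robots"' in body and "noindex" in body
        flagged = "noindex" in header.lower() or has_meta
        print(url, "->", "noindex OK" if flagged else "STILL INDEXABLE")
```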
Need Help?
Remember that the traffic loss this time could be due to any number of reasons; we can only propose the best possible tips to get you a step closer to a well optimized site. The poor quality content on your site should be dealt with immediately using the method described above. You do not need to delete poor quality posts; just noindex them!
Let me know if you need any help. Please do report back after a week or so to let us know how this method changed your overall site traffic. We will try our best to keep the Blogger community updated with all the latest applicable SEO techniques. Wishing you a happy blogging experience and a happy traffic recovery! Peace and blessings, buddies.
11 Comments:
Thanks for this great post. Sir, I need help: my site is infected with malware and antivirus software has blocked it. Google did not send me any email about this in Webmaster Tools. How can I clean it? Please, sir, share a post about this topic. Thanks in advance.
www.Mudassarhussain.com
You are using the Bidvertiser ad network, and this network serves some ads with malware. If you want to clean your site, then stop using Bidvertiser (or any other network like it); otherwise you can continue with the malware.
W.salam Mudassar,
You must check all the third-party plugins you may be using that ask you to install JavaScript on your template. Uninstall any such scripts that create popups or annoying ads on your site.
Faheem could be right, because Bidvertiser and similar sites are not trusted ad networks.
I have that type of blog: I share giveaways, freebies, discount coupon codes, etc.
Khurram bhai, can you give me some tips or advice about my blog?
My blog (WP): http://www.ihatecracks.com/
Didn't I share the tip in the above post? :)
Google itself should do something about this issue and at least announce the problem.
This is a really awesome post, thanks for sharing. I lost a massive amount of traffic on my blog; I am very fed up and sometimes feel like quitting. I will try out the steps you stated above and hope they will work fine.
One more question: in Webmaster Tools, under Blocked URLs, I have a large number of URLs blocked by robots.txt. How can I recover from that? Can you please help me in this regard?
Regards
Pavan
If you have URLs which are blocked by robots.txt, then make sure you have disabled the custom robots.txt file by going to Settings > Search Preferences.
The default robots.txt file is best, so you should not create your own custom file.
@admin
I have dropped you a mail; please check it. I need your help, man.
Solution#2:
Dear friends, solution 2 is too easy. Go to Blogger > Settings > Search Preferences, simply disable Custom Robots Header Tags (do not apply any settings) and click Disable. Now you're done. Follow the same process for Custom Robots Tags and disable it too; do not change anything else.
Dear friend, please also check the links on your blog; one bad link can harm your whole site, so please check your links and save your site.
I want to say that I recovered my blog traffic in 10 days, my traffic is getting better than all of my friends' blogs, and my Alexa rank is improving day by day. Bye friends, I will come back soon.
You recovered your blog traffic just by doing this?