Back from vacation, trying to fix this thing.
In the meantime I got another e-mail from them on 13 August:
Googlebot can't access your site
August 13, 2015
Over the last 24 hours, Googlebot encountered 37 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 37.8%.
You can see more details about these errors in Webmaster Tools.
If the site error rate is 100%:
Use a WHOIS tool to verify that http://thassos.ucoz.com/ has a proper whois record and that nameservers are configured for the site. If not, contact your domain registrar to update your whois records.
Using a DNS lookup tool, verify that the nameserver's name can be resolved to an IP address. If not, either update your whois record to contain an IP address for your nameserver, or update the DNS records for the nameserver.
Using a DNS lookup tool, verify that http://thassos.ucoz.com/ can be resolved to an IP address. If it can't, update the DNS record for http://thassos.ucoz.com/ on your nameserver.
If the site error rate is less than 100%:
The most likely explanation is that your nameserver is overloaded. Contact your hosting provider and discuss the configuration of your DNS server and the possibility of allocating more resources to your DNS service.
If your site redirects to another hostname, another possible explanation is that a URL on your site is redirecting to a hostname that has a DNS problem. Consider checking the links to which your site redirects and make sure that the sites in your redirect links don't have any DNS issues.
After you think you've fixed the problem, use Fetch as Google to verify that Googlebot can properly access your site.
Learn more in our Help Center.
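A quick way to run the DNS check the email describes is to resolve the hostname yourself. A minimal Python sketch (the hostname is the one from the email; the `resolve` helper name is mine):

```python
import socket

def resolve(hostname):
    """Return the IPv4 address for hostname, or None if DNS resolution fails."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

# If this prints None, Googlebot's resolver is likely failing for the same reason.
print(resolve("thassos.ucoz.com"))
```

Run it a few times over an hour: an intermittent None would match a partial error rate like the 37.8% in the email, pointing at an overloaded or flaky nameserver rather than a missing record.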
I did ask them to fetch again. What could be the problem?
Now back to you, bigblog.
The original text means that many errors related to Googlebot's analysis of the site have been corrected (it's not about robots.txt).
OK i can understand that.
The stat counter should remain there, because that's how the days of inactivity are counted. If it's not there, your site will be considered inactive, and it will be suspended after 40 days, even if it has enough visitors.
Got it. It stays untouched.
According to Google, there is nothing wrong with having the scripts in <body> rather than in <head>, so one solution is to remove the scripts from the head and put them into the "Bottom part of the website" template. To do this, follow these steps:
1. Go to the templates, and replace </head> with </<?'head'?>> and </title> with </<?'title'?>> - the scripts are deleted now.
2. Also delete the following code:
<link type="text/css" rel="StyleSheet" href="/_st/my.css">
3. Go to the "Bottom part of the website" template, and insert the necessary scripts and styles:
<link type="text/css" href="/_st/my.css" rel="StyleSheet">
<link type="text/css" href="http://s102.ucoz.net/src/base.css" rel="StyleSheet">
<link type="text/css" href="http://s102.ucoz.net/src/layer6.css" rel="StyleSheet">
<link type="text/css" href="http://s102.ucoz.net/src/ulightbox/ulightbox.css" rel="StyleSheet">
<link type="text/css" href="http://s102.ucoz.net/src/gstoolbar2/css/style.css" rel="StyleSheet">
<link type="text/css" href="http://s102.ucoz.net/src/gstoolbar2/css/share.css" rel="StyleSheet">
I don't understand how this will help. Google seems to only want a robots.txt with JavaScript/CSS allowed.
khen, this mass email message was sent to the owners of many websites, not only uCoz ones (WordPress, Joomla, etc.). At the moment it looks more like a mistake, because it was also sent to websites that don't have anything blocked in robots.txt at all. Therefore, I suggest that you don't change anything for now. I'll post here as soon as I have new info.
Any news, Sunny, regarding this?
I don't want to play with robots.txt, since it's uCoz's system and they usually know best what to block and what not to block.
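If you want to see what the current rules actually block for Googlebot before changing anything, Python's standard `urllib.robotparser` can check individual paths. The rules below are a made-up example for illustration, not uCoz's real robots.txt; fetch your site's live /robots.txt to test the real file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content (assumed for this example only).
robots_txt = """\
User-agent: *
Disallow: /a/
Disallow: /stat/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot needs to fetch CSS/JS files to render the page properly:
print(rp.can_fetch("Googlebot", "/_st/my.css"))   # True: not under any Disallow rule
print(rp.can_fetch("Googlebot", "/a/script.js"))  # False: blocked by Disallow: /a/
```

If a path Google complains about comes back False here, that rule is the one blocking rendering; if everything relevant is True, the warning email really was sent in error, as Sunny suggests.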
Added (2015-09-06, 1:11 PM)
Also, how do I fix "Optimize CSS Delivery" of the following: