Website's Indexing Status
All uCoz websites have an indexing status that is displayed at the top of the Control Panel's main page (/panel/?a=cp). This parameter shows whether indexing by search engines is allowed for the website, i.e. whether the website is in quarantine.
The indexing status can show one of two options: "indexing is allowed (quarantine is removed)":
Or "indexing is prohibited (the website is in quarantine)":
The status "indexing is prohibited (the website is in quarantine)" is assigned by default to all newly created websites.
Quarantine Removal Policy
A website can become available for indexing either automatically (when a premium plan is purchased) or upon the website owner's request. If the website does not have a premium plan and you want the quarantine to be removed, submit a request from the website's Control Panel:
A pop-up window with information on the quarantine policy will appear:
After the request has been submitted, the website will be checked automatically against a number of criteria: the website's age, the presence of a custom domain name, content, a verified phone number, etc. On the basis of these criteria, the system decides whether the quarantine should be removed. We cannot provide a more detailed description of the algorithm.
Note! If quarantine removal was denied, the next request can be submitted no sooner than 7 days later.
Robots.txt
A website's robots.txt file is located at http://your_website_address/robots.txt. A website with the default robots.txt is indexed in the best possible way: we set up the file so that only pages with content are indexed, rather than all existing pages (e.g. the login or registration page). As a result, uCoz websites are indexed better and get higher priority compared to sites where all unnecessary pages are indexed.
That is why we strongly recommend not replacing the default robots.txt with your own.
If you still want to replace the file with your own, create a text file using Notepad or any other text editor and name it "robots.txt". Then upload it to the root folder of your website via File Manager or FTP. Note: while website indexing is prohibited, no modification of the robots.txt file is possible.
The default robots.txt looks as follows:
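The exact contents of the default file can vary between websites and may change over time; the listing below is an illustrative sketch of the kind of rules it contains (the specific disallowed paths are assumptions for illustration, not a verbatim copy), with service pages blocked and content pages left open:

```
User-agent: *
Disallow: /a/
Disallow: /stat/
Disallow: /register
Disallow: /search/
Disallow: /admin/
```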
Robots.txt during the quarantine looks as follows:
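While the website is in quarantine, the file blocks all crawlers from the entire site. The standard form of such a rule (assuming uCoz uses the conventional block-all directive) is:

```
User-agent: *
Disallow: /
```

`Disallow: /` applies to every URL on the site, so compliant search engine crawlers will not index any pages until the quarantine is removed and the file is replaced.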
Informers are not indexed because they display information that already exists on the website; as a rule, this information is already indexed on the corresponding pages.
Question: I have accidentally messed up robots.txt. What should I do?
Answer: Delete it. The default robots.txt file will be added back automatically (the system checks whether a website has it, and if not – adds back the default file).
Question: Is there any use in submitting a website to search engines if the quarantine hasn't been removed yet?
Answer: No, your website won't be indexed while in quarantine.
Question: Will the robots.txt file be replaced automatically after the quarantine has been removed? Or should I update it manually?
Answer: It will be updated automatically.
Question: Is it possible to delete the default robots.txt?
Answer: You can't delete it, it's a system file, but you can add your own file. However, we don't recommend doing this, as was stated above. During the quarantine it is impossible to upload a custom robots.txt.
Question: What should I do to forbid indexing of specific pages?
Answer: Add the following lines to the robots.txt file:
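For example, each page or directory you want to exclude gets its own Disallow line (the paths below are hypothetical and should be replaced with your own):

```
User-agent: *
Disallow: /photo/
Disallow: /blog/some-page/
```

Each Disallow value is matched against the beginning of the URL path, so `Disallow: /photo/` excludes every page under that directory.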
Question: I have forbidden indexing of some links by means of robots.txt but they are still displayed. Why is it so?
Answer: By means of robots.txt you can forbid indexing of pages, not links: links to a disallowed page will still be displayed on other pages of the website.
Question: I want to make some changes in my robots.txt file. How can I do this?
Answer: Download it to your PC, edit it, and then upload it back via File Manager or FTP.