Are there any RapidWeaver-specific best practices for website testing? I'm specifically looking to stop Google, Bing, and other search engines from indexing draft sites that aren't ready for prime time. "Modify your robots.txt file" is a common solution, but that has nothing to do with RapidWeaver itself.
I've come across several WordPress-specific solutions, but the site in question has no WP components.
If there is a snippet that would prevent indexing until the site goes public, that would help.
I don't know if there's a "best practice", but there are a number of ways to prevent indexing.
If you run your tests on a "testing" subdomain, blocking indexing becomes much easier.
You can place a very generic robots.txt file at the root of the subdomain. That will stop legitimate crawlers like search engines, but it provides no protection against less legitimate traffic, human or automated.
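A minimal robots.txt for this purpose just asks all crawlers to stay away from everything. Place it at the web root of the testing subdomain (alongside the index page RapidWeaver publishes):

```
# robots.txt for a staging/testing subdomain
# Applies to all crawlers
User-agent: *
# Disallow everything on this host
Disallow: /
```

Remember this is advisory only: well-behaved bots honor it, scrapers ignore it, and the file itself is publicly readable.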
You can also add security measures on the subdomain, such as .htaccess and .htpasswd files, or a third-party security product like PageSafe, which block indexing along with everyone else who doesn't have login credentials. This approach would be my choice, as it also keeps the competition out. @ben used to have a video on the .htaccess approach on the community site, but I couldn't find it; perhaps it's out on YouTube.
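As a sketch of the .htaccess approach, assuming an Apache host with basic authentication available (the password-file path here is an example, not a real location):

```
# .htaccess in the staging subdomain's web root
AuthType Basic
AuthName "Staging site - authorized users only"
# Example path; use the real absolute path on your server,
# ideally outside the public web root
AuthUserFile /home/example/.htpasswd
Require valid-user
```

You would create the password file once with Apache's `htpasswd` tool, e.g. `htpasswd -c /home/example/.htpasswd yourname`. Because crawlers can't log in, nothing behind the prompt gets indexed, and casual visitors are locked out too.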
You can also mark pages not to be indexed at the page level. This is built into RapidWeaver, but it's the most tedious option and carries the risk of forgetting to remove the setting before going live, which can quickly ruin your SERP rankings.
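If you ever need to do this by hand (for example in a page's custom head code rather than through RapidWeaver's setting), the standard robots meta tag looks like this:

```
<!-- Inside <head>: ask search engines not to index this page
     or follow its links. Must be removed from every page at launch. -->
<meta name="robots" content="noindex, nofollow">
</meta tag must appear on each page you want excluded-->
```

Wait, corrected fragment:

```
<!-- Inside <head>: ask search engines not to index this page
     or follow its links. Must be removed from every page at launch. -->
<meta name="robots" content="noindex, nofollow">
```

The tag must appear on each individual page you want excluded, which is exactly why this approach is the easiest to get wrong across a whole site.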
I have the same question. I'm just marking this post so I can find it again.
As it turns out, I had a copy of PageSafe which I hadn't used for a while. That stack solved the issue.