Wondering if there is a way to share my website with a remote client so I can get feedback before I publish it.
If you have the ability to do it, you could publish to a subdomain that is password protected through .htaccess as a "test site". That way it keeps search engine bots from seeing it, but anyone you give the password to can view it and comment.
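For reference, a password-protected directory on Apache usually only needs two files. This is a minimal sketch, not a drop-in config; the server path, account name, and username below are examples, so adjust them for your own host:

```
# .htaccess in the root of the test subdomain (Apache)
AuthType Basic
AuthName "Test Site"
# Full server path to the password file, ideally outside the web root
# (/home/youraccount/ is an example path)
AuthUserFile /home/youraccount/.htpasswd
Require valid-user
```

You can create the password file with Apache's htpasswd tool, e.g. `htpasswd -c /home/youraccount/.htpasswd clientname`, then give the client that username and password.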
Another option would be to export the site to a folder inside of a cloud service like Dropbox. You could then share the folder with your client.
Thanks bloguy. The first option is the better one. I can likely publish to a subdomain. The folder publishing is a bit clunky. For instance, I already tried that, and when you click a link on, say, the home page it takes you back to the folder directory, not the next page.
If you get a chance to listen to this week's podcast (#15), Ben and Dan will explain some of your options too.
(OT: The tea's in the post.)
Can I ask a stupid question regarding this issue?
I have my own website www.stormdesignprint.com, but currently when I'm making test sites I use another domain entirely which I pay separately for (www.stormtempweb.com). All of my works in progress are kept here.
However, this means I'm paying for 2 domains and 2 sets of hosting.
Are there any drawbacks to me simply hosting my clients' temporary sites on my own domain, so it might look like this: www.stormdesignprint.com/test/client1, for example?
Nobody outside of the client would ever know to look for it, so I can't imagine my other clients stumbling upon it accidentally, and it means I could drop the extra hosting package I'm paying for.
The only thing is, I can keep both Rapidweaver projects separate, can't I? Even though they are both using the same publishing settings?
I believe you can do exactly as you suggest. There's certainly no harm in trying it, because you could immediately put your stormdesignprint site back up if the new test site had indeed interfered with it. Which it won't, so you needn't worry! Anyway, it seems like you already do this on the temp site, and presumably it works there?
Do the .htaccess thing though, to stop Google immediately showing it in results (a likely chance!!). Here is a site that steps you through this:
http://www.freewebmasterhelp.com/tutorials/htaccess/3
This is how I do it, but you should be wary of leaving it there for long, as it won't do you any favours from an SEO perspective if you have duplicate content. Also, if your final version has different URLs, you will need to set up 301 redirects if the search engines have found your demo site. Definitely use a robots.txt file to disallow search engines; I'm not 100% certain, but it's highly likely to work in the short term. And yes, you can keep both projects separate, because the demo site is self-contained in its own folder.
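To illustrate the robots.txt part (paths here are examples; note the file has to sit at the domain root, not inside the demo folder):

```
# robots.txt at http://www.stormdesignprint.com/robots.txt
User-agent: *
Disallow: /test/
```

And if a demo URL does get indexed, a 301 in the .htaccess at the domain root can point it at the final site once that's live (the domains below are just placeholders):

```
# Redirect the old demo path to the live site (Apache mod_alias)
Redirect 301 /test/client1 http://www.endclient.com/
```

Bear in mind robots.txt only asks well-behaved crawlers to stay away; it isn't access control.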
I checked mine some months back and had 9 old demo projects left on there! Very easy to forget, so now I diarise a 'delete demo' date.
ok thanks for the advice!
Hi there, sorry for the delay in responding to this one. Manofdogz, it won't matter if search engines find my demo site, will it? Because the URL will be something like www.stormdesignprint.com/test/client1 but the live site will be www.endclient.com, and the two URLs are unrelated. So even if search engines find the dummy site and list it, nobody is ever likely to accidentally type in that test URL, are they? Because the only people to ever know it temporarily exists are the client and me?
If I DO decide to add a file to stop Google trawling those particular pages on my site, can I do it with either a .htaccess file OR a robots file? Does it matter?
Password protection through .htaccess will prevent all the bot trawling.
Sorry for the delay. I think it will matter, because Google will return results from your server as well as your client's server, and they can take ages to disappear. It happened to me once when I left a 'dev' version of a client site on my server by mistake; it was coming higher for key searches than the real site!
Whatever you find easiest. A robots file won't stop malicious crawling, but it will stop Google.