How do I stop Google from showing a now-deleted site in search results?

Whilst building and testing a website, I had it loaded on my temporary webspace.

Since then I have uploaded the final site to its final hosting space and deleted the test site.

I’ve just noticed that Google is still listing it when I do a search.

How can I tell Google not to bother crawling these pages anymore, seeing as there is no site there? Or do I just wait for Google to realise it’s not there anymore?

Many thanks.

You can tell Google the site moved in Webmaster Tools.
You can also put a 301 redirect on the old site to redirect to the new one.
In the future you can save yourself some trouble by adding this to your temp project:
`<meta name="robots" content="noindex, nofollow">`


Thanks for this info. I didn’t set up a webmaster page for the temp site, so how would I tell Google? Would I do it using the new site’s webmaster tools?

I can’t put a 301 redirect because the old site doesn’t exist anymore; it’s not live.

In the future, if I do this with my temporary sites, do I put that in the RapidWeaver site or in a robots file?

Thanks for the advice 🙂

Ah, I think I’ve done it on the dummy host site. Even though the site isn’t there anymore as a subdomain, I’ve put a redirect on the URL; we’ll see if that works. It can take some time to propagate through to Google, I guess?


I will try to be a little clearer, but keep in mind I’m not an SEO or Google pro; you should consider researching this yourself.

301: use if content has moved.
410: if content is permanently removed, you should tell the bots with a 410, not a 404. This will result in removal from the Google index (eventually).
404: not found, but it may be available later (okay for bots to try again later).
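If the old host runs Apache, the 301 and 410 responses above can be set in an `.htaccess` file. This is just a sketch with made-up paths and domain, assuming mod_alias is enabled on the server:

```apache
# Hypothetical paths/domain for illustration only.

# 301: content has moved permanently -- send visitors and bots to the new location.
Redirect 301 /old-page.html https://www.newsite.example/new-page.html

# 410: content is gone for good -- tells crawlers to drop it from their index.
Redirect gone /deleted-page.html
```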

NOINDEX, NOFOLLOW
You can use a robots file and also put the meta tags in the page itself (I do both).
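For completeness, these are the two pieces that combination usually involves. The robots.txt rules below are the standard “keep everyone out” form, nothing site-specific:

```html
<!-- In the <head> of each dev page -->
<meta name="robots" content="noindex, nofollow">
```

```
# robots.txt at the site root: asks all well-behaved crawlers to stay away
User-agent: *
Disallow: /
```

One caveat worth knowing: a robots.txt Disallow only stops crawling, so a URL can still appear in the index if other sites link to it, and a blocked crawler will never see the noindex tag. For guaranteed de-indexing, the meta tag on a crawlable page is the more reliable of the two.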

Webmaster Tools: I think you should still be able to do it; you may have to upload some code to the dev site first to authenticate to Google that you’re the owner.
You can also use Webmaster Tools to manually block URLs, but it’s temporary.
Here is Google’s how-to on blocking URLs:
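On the ownership step: verification is typically done either by uploading an HTML file Google gives you, or by adding a site-verification meta tag to your home page. The token below is just a placeholder; Webmaster Tools generates the real one for you:

```html
<!-- Placeholder token -- use the one Webmaster Tools gives you -->
<meta name="google-site-verification" content="YOUR-TOKEN-HERE">
```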

More tips for the future:
On your development site, DO NOT auto-create a sitemap.
Put your page(s) behind password protection (e.g. PageSafe / Sitelock).


Hi Scott, sorry I’m slow to respond to this one.

When you say don’t add a sitemap to a test site: I thought you could add a sitemap but ask it to disable all search engines. I use Sitemap Plus; is that wrong?