A few people are still referencing old pages from a previous version of my website, and I want to delete those pages so they don't show up on Google.
I am accessing the website files over FTP using CyberDuck.
Is it just a case of deleting the old files and folders from within CyberDuck, and then they will be gone from Google search?
So, for example, delete this folder in the image, where I have previous info from a Jobs page. Would I just delete that folder?
Deleting the files and folders will not remove them from SERPs (search engine results pages). They will continue to show up, as search engines store data for quite some time. If you delete the old pages, your customers will get a 404 Not Found when they click on the SERP entry.
You should probably consider setting up 301 redirects to the equivalent new pages first, then deleting the old files.
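On an Apache host, 301 redirects like this are typically done in an `.htaccess` file at the web root. A minimal sketch, assuming an Apache server with `.htaccess` overrides enabled; the `/jobs/` and `/careers/` paths below are placeholders, not your actual folder names:

```apache
# Redirect a single retired page to its replacement (placeholder paths).
Redirect 301 /jobs/index.html /careers/

# Or redirect an entire retired folder to its new location,
# preserving whatever comes after the folder name.
RedirectMatch 301 ^/jobs/(.*)$ /careers/$1
```

With the redirects in place, search engines that follow the old SERP entries land on the new pages instead of a 404, and over time the index is updated to the new URLs.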
Hi!
As the RW publisher does not remove anything from the server, you do need an FTP client to remove old content. If you are unsure what to remove and it doesn't bother the site owner, you can always delete everything related to the RW project and re-publish it all via RW, with a proper robots.txt telling Google which pages to index. The other part is 'removing' indexed pages with Webmaster Tools. I say 'removing' because it's not that straightforward: Google says that in some cases it's a matter of letting an indexed page fade into oblivion over time.
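For reference, a robots.txt sits at the site root and lists the paths crawlers should skip. A minimal sketch with a placeholder folder name:

```txt
# robots.txt at the site root; /old-jobs/ is a placeholder path.
User-agent: *
Disallow: /old-jobs/
```

One caveat: `Disallow` only stops crawling; URLs Google has already indexed can linger in results, which is why the removal step in Webmaster Tools (now Search Console) is mentioned separately above.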
Anyway, I had to remove some old 'pages' recently. The site had gone through a few iterations, and some pages that had been indexed were no longer relevant. As they simply didn't exist any more, there was no need for redirects to preserve the SEO juice (someone correct me if I am wrong).
Simply Google your URL and check the search results for any of your site's URLs that are no longer valid.