Site disappeared from Google following HTTPS upgrade


(LJ) #1

URL: https://gebbyeaton.co.uk : The site has been around for about 5 years and has always done fine in Google for searches such as ‘female wedding singer in midlands’, ‘female jazz singer in Warwickshire’, etc. A few days ago I updated the site to HTTPS and it has pretty much vanished from Google - not a single hit on the new HTTPS version.

I cannot see what I have done that is so wrong that Google can make the site disappear:

  • .htaccess forces HTTPS and non-www (as previously)
  • The padlock icon appears - no obviously insecure elements on the site
  • All 4 versions added to Google Search Console: http://, https://, http://www., https://www. - and the sitemap applied only to the canonical version
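For reference, a minimal .htaccess along these lines would do what the first bullet describes - this is only a sketch of the standard mod_rewrite approach, not the actual rules on the site, which may differ:

```apache
RewriteEngine On

# Redirect any plain-HTTP request to HTTPS. 301 (permanent) is what
# tells Google the move is intentional.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Strip the www. prefix so only the non-www host is canonical.
# %1 is the captured hostname without "www.".
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ https://%1%{REQUEST_URI} [L,R=301]
```

Both rules issue 301s, so http://www.example.com/page ends up at https://example.com/page in at most two hops.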

I can’t say it is a perfect example of SEO but it’s pretty good:

  • mobile friendly
  • Most images have alt tags
  • Decent use of header tags
  • browser descriptions in place

So why the hell did it disappear? We’re not talking about a slip down the page; it’s a wholesale disappearance, and the reality is this singer will lose much-needed income over this - which is pretty irritating when I’ve taken care to do things properly. I sincerely hope the other sites I’ve upgraded don’t all do the same thing!

Any ideas welcome


(Doug Bennett) #2

Those both look like explicitly local keyword phrases, and local search results are quite different beasts. My best guess would be that local citations such as Google My Business still point to the old http URL. Again, just a guess, but I know local searches are rather more sensitive to things like the NAP (name, address, phone) matching exactly across citations.
A good resource for local SEO:


(LJ) #3

I don’t think this is the issue. The site is targeted at the UK, and there’s only one ‘Midlands’ and one ‘Warwick’. A week ago these searches (without quotation marks) would have given results on the first page or upper second page of Google. There has never been NAP matching, since her address is irrelevant and the phone number is a mobile. Even if an increased focus on local search were an issue, it would only cause a drop in position, not a disappearance.

Another oddity I noticed today: when I do a site:gebbyeaton.co.uk search, all pages except the home page show as indexed. Normally the home page is the first to show, and it is definitely the most important. In Google Search Console, indexing is shown as ‘partial’. Hopefully it will appear shortly and things will get back to normal.

One possible issue is that when I added a new sitemap for the secure domain, I deleted the old one from the non-secure version. The best advice from some quarters is to keep the old one until the secure site is fully indexed. Even so, deleting a sitemap shouldn’t have any great impact.

Whatever the outcome, it is worth noting that an http-to-https move does frequently result in a Google slip - ironic, given the pressure Google is putting us all under.


(scott williams) #4

Using a site:yourURLhere search to see what Google actually has indexed would be a good first step.
Also, what Doug said does have merit - the NAP is crucial for showing up in localised searches.

Also check Webmaster Tools and make sure you don’t have any messages or manual actions showing up against the site.


(LJ) #5

Already done that - see my last post - everything is indexed except the home page, oddly!

NAP would be important if there were 200 female wedding singers in the Midlands, but there aren’t. I’m not saying it isn’t important at all, but there’s no way it’s important enough to account for a disappearance off the Google radar. Her name is on some other localised wedding sites, but her exact geographical location is not important - unlike a shop you need to visit, for example - it’s a broad area. I have a large number of sites that have no NAP emphasis but respond very well to local searches, mainly because they are not in highly competitive sectors.

No error messages at all, just the usual “no errors in 90 days - nice” message!


(LJ) #6

OK - confession time… and I accept the award of Webmaster Dunce of the Day! Some interesting questions arise, however…

A short time back I made quite a few amendments to the site and also had a few display issues with the theme, so I temporarily published it to our dev server. While it was on there, I unchecked ALL the “Index and Follow” checkboxes in the Meta Tags area so that the temporary site wouldn’t get indexed.

…and who forgot to re-check them when publishing to the normal domain?!? That’ll be me …

What is interesting however, is:

  1. Despite ALL pages having no index no follow robots txt, every https page has been indexed EXCEPT the home page

  2. These changes happened a while before the HTTPS update, yet republishing with the no robots txt made no difference to SEO until the site was published to HTTPS. Only then did things nose-dive. This ties in with a warning given by SEO / Google Penalty freak Marie Haynes, namely that pushing your site to HTTPS is effectively a site move and makes Google re-evaluate your site for quality.

  3. Finally - none of this made any difference to the site’s performance on Bing, where a search for Wedding Singer Warwickshire came second from top
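The culprit above is easy to check for directly. A quick sketch (Python, standard library only - the tag name and attributes follow the standard robots meta convention, nothing RapidWeaver-specific) that scans a page’s HTML for a noindex directive:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # attrs arrive as (name, value) pairs
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())


def is_noindexed(html: str) -> bool:
    """True if the page carries a noindex robots directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)
```

Running `is_noindexed()` over the fetched HTML of each published page would have flagged the problem immediately - every page carrying `<meta name="robots" content="noindex, nofollow">` returns True.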

So, all is now published as it should be and hopefully Gebby will start getting hits and bookings again.

For my part - next time I do a temporary dev-server publication, I’ll copy the project and rename it!


(scott williams) #7

When I do a temp publish, I just throw a PageSafe stack in a partial at the top - that stops crawlers in their tracks. When finished, just delete it from the partial and re-publish.

Glad you got it sorted


(LJ) #8

Good advice - I’ll try that next time.


(LJ) #9

Just to add - I re-published minus the no robots txt, and all is back to normal within 12 hours. I just searched for female wedding singer Warwickshire and she’s no. 3. Phew!


(Doug Bennett) #10

If you uncheck the box in RapidWeaver, it’s a noindex, nofollow meta tag - RapidWeaver doesn’t produce a robots.txt file. Just clarifying for people who may read this post. Glad you got it working.
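For anyone unsure of the distinction: robots.txt is a separate plain-text file at the site root that blocks crawling, while the robots meta tag sits in each page’s head and blocks indexing. Illustrative fragments (the paths and values here are just examples, not from the site in question):

```
# robots.txt - lives at https://example.com/robots.txt
# Blocks crawling of matching paths; note Google may still index a
# blocked URL if other sites link to it.
User-agent: *
Disallow: /dev/

<!-- Robots meta tag - goes inside each page's <head>.
     The page can still be crawled, but is dropped from the index. -->
<meta name="robots" content="noindex, nofollow">
```

Which is why the meta-tag route is the safer of the two for keeping a dev copy out of the index entirely.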


(LJ) #11

That makes sense - however, the reason I realised the issue was precisely because ‘blocked resources’ in Search Console flagged a robots.txt file. I knew I hadn’t added one to the server, so I checked the RW tabs and, sure enough, the pages were unchecked as stated. I didn’t look in detail and it will have gone now, but I’m sure that’s what it indicated.


(system) #12

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.