I have another couple of questions with SEO Helper 2… just downloaded it as it seems like the penultimate SEO tool for RW now. Great work Joe! @joeworkman
I am using SEO Helper 2 within a Foundry project. @Elixir
Do I need to place the Foundry base stack on the 404 page and also the Sitemap pages?
Should I keep the 404 and Sitemap pages as drafts (I have them as draft pages at the moment) until the project is completed and the new website is ready to go “live”? I have a draft version, but this is on my server as a subdomain. Once the client is happy and ready, it will go live on their server with their actual domain.
Multiple geo locations: if there are multiple locations, how do I implement this? I tried placing two geo location meta tags but got a pop-up modal error saying there were two geo tags.
Hi Scott (@scottjf),
Someone may correct me on this as I’m a noob myself, but here’s what I’d say:
Yes, put the base stack on the 404 page, but not on the sitemap page.
I’d turn off robots within SEO Helper and also, in the subdomain where you’re putting your site for the client to view, add a robots.txt file with a Disallow for the whole directory. Then you can keep all the SEO in there and not worry about being crawled, and turn it all back on when you hand over.
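Something like this in the subdomain’s root is what I mean (just a sketch; it blocks everything for all crawlers):

```
User-agent: *
Disallow: /
```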
Not 100% on this. Are you using multiple locations on the same page? If so, could you add multiple-location ‘structured data’ rather than a geo tag, as it will contain all the same info and more anyway? As I say, just a thought, and not sure if that’s the best way to do it.
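As a rough sketch of the idea (all the names, addresses and coordinates below are made up, so swap in your own):

```html
<script type="application/ld+json">
[
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Business - Maidstone",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "Maidstone",
      "addressRegion": "Kent",
      "addressCountry": "GB"
    },
    "geo": { "@type": "GeoCoordinates", "latitude": 51.27, "longitude": 0.52 }
  },
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Business - Canterbury",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "Canterbury",
      "addressRegion": "Kent",
      "addressCountry": "GB"
    },
    "geo": { "@type": "GeoCoordinates", "latitude": 51.28, "longitude": 1.08 }
  }
]
</script>
```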
Cheers,
Roger
p.s. ‘penultimate’ means ‘next to last, or second to last’
Draft pages do not get published, so make sure that they are not set to draft when the site goes live. I guess this makes sense for the sitemap page. No real need for the 404 page, really.
Someone else recently asked me this and I could not find the post to reply back to. This is a great question, and there’s not much data about it on Google that I could find. To my knowledge, you can only have one location meta tag on a single page. In order to get both locations recognized, you should have a webpage for each individual location. On your homepage, you could add the main location so that it gets indexed.
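For example, the conventional geo meta tags for a single location page look something like this (the place name and coordinates here are placeholders):

```html
<meta name="geo.region" content="GB-KEN">
<meta name="geo.placename" content="Maidstone">
<meta name="geo.position" content="51.27;0.52">
<meta name="ICBM" content="51.27, 0.52">
```

One set like this per location page, plus the main location’s set on the homepage.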
I did find this article that I thought was interesting.
Oops… I’m the noob… it should have been “ultimate” SEO tool, sorry Joe… not second to last!
Yes, good suggestion about turning off the robots.
Yes, I was going to place the multiple locations on the same page, but I suppose I could do a different location on each page instead. I can’t see a ‘multiple location structured data’ option in SEO Helper, though?
Hey @joeworkman, I’ve updated the server to PHP 7.2+ but I’m still getting the robots.txt error…
Can I safely ignore that? Or, would I be better off just uploading a text file?
Thanks
You don’t have to have a “robots.txt” file. It’s a very old misconception that search engines like Google won’t index a site without one, or that they give a ranking boost for having a robots.txt file that says “allow indexing”.
From Google:
robots.txt is used primarily to manage crawler traffic to your site, and usually to keep a page off Google, depending on the file type
If you aren’t worried about overloading your website with search engine crawler traffic, and don’t have media files you don’t want indexed, then you don’t need a robots.txt file.
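If you do want one, robots.txt really only does something when you’re blocking things, e.g. (made-up folder name):

```
User-agent: *
Disallow: /private-media/
```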
OK cheers @teefers, I was only concerned because web.dev flagged it as invalid. I don’t normally use robots files unless I want to ‘Disallow’ crawlers from development sites etc. I’m more concerned that there is invalid PHP or something that would hurt ranking…
Number 1 would be nice but impossible, @NeilUK. On the first page would be good, though, especially for the client who’s paying for the site and wants to see results.
robots.txt is not required, and SEO Helper does not help create this file. It does help through the robots meta tags on each individual page, though. It could be an interesting feature to add; I will think about it.
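For reference, the per-page tag it writes is the standard robots meta tag (index/follow shown here as example values):

```html
<meta name="robots" content="index, follow">
```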
@NeilUK SEO Helper is a pretty cool stack. You should try it before you knock it. It’s really helped with my SEO on Weaver’s Space.
Any tool that tells you that not having a robots.txt file is an error has lost all credibility with me. So getting a 92% on such a tool doesn’t say much. If you’re looking to “measure” how well you’re doing, look at how well you rank for targeted search phrases.
If you want good search engine results, you might be better off spending your effort on building a great website. Great websites get found.
I’m not saying to ignore the technical SEO stuff, but I think way too many folks put a lot of resources into it that would probably be better spent on other areas.
Google runs a lot of stuff; it’s a huge company, and the left hand often doesn’t know what the right hand is doing.
I just ran a test site through web.dev: one page with an embedded sample video and some lorem ipsum on it, and no meta tags, description, or anything else dealing with SEO. It got an 86 on their SEO score.
By the way, I don’t have a robots.txt file and didn’t get the read error. Roger (@rojharris), are you sure you don’t have a file?
Hi Doug, I only have what’s generated in Joe’s tool. Each page has a meta tag added that says: <meta name="robots" content="index, follow">. There is no separate file.
The site is here: www.kentlogs.co.uk; you can look at the page source.
It’s finished and delivered now though so I won’t be changing anything. The client loves it so I’m happy.
Roger,
Sorry about the spelling.
I just tried to link to: http://www.kentlogs.co.uk/robots.txt
And I get your 404 page, as it should: https://www.kentlogs.co.uk/not-found/
I then ran your site through web.dev and it still shows that as an invalid robots.txt file.
My pretty empty test site has a simple HTML 404 page and doesn’t have a robots.txt file either. If I try to link to my robots.txt file it goes to my 404 page (as it should), but when I run my test site through web.dev it doesn’t flag the error.
So either web.dev, even though it’s run by Google, isn’t working, or the way the SEO stack generates the 404 page is at fault?
I deleted just the ‘robots’ stack from within the SEO Helper set I was using, then made a robots.txt file to allow all traffic and plonked it on the server. Now the error has gone and the SEO score is 100%.
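For anyone else hitting this, an allow-all robots.txt is as simple as this (the empty Disallow means nothing is blocked):

```
User-agent: *
Disallow:
```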