Help with Joe's SEO helper stack

YOU DON'T NEED THE ROBOTS.TXT FILE
You're getting the error because the testing site is programmed to look for a robots.txt file, and if there isn't one then (even though it's not needed) it throws a hissy fit.

SEO scores are meaningless in the real world; they are a guide, NOT an official score.

If you're using the Robots stack in the SEO tools suite, you don't need a robots.txt file.

There was no need to delete the Robots stack from the pages. There is a hierarchy to how it's all processed. The robots.txt file is like the default set of rules for what Google should index. However, if you define rules at the page level with the Robots stack, those will override whatever is inside robots.txt.

That's not exactly true. If you exclude a site or part of a site with a robots.txt file, legitimate crawlers like Google, Bing, Yahoo and others will never crawl the page to read the page-level meta tags.

All search engines, including Google, default to indexing pages and following links without being told. They will not index a page that has a noindex robots meta tag.
Per Google:

Note that Google doesn't index pages with a noindex directive (header or tag). However, it must be able to see the directive; if the page is blocked by a robots.txt file, a login page, or other device, it is possible that the page might be indexed even if Google didn't visit it!

Right now Google supports some non-standard robots.txt directives that allow a "noindex" entry in that file. However, earlier this month Google announced that they are open sourcing their robots.txt parser, and in doing so they are dropping support for these non-standard directives.

In short, a robots.txt file should be used for preventing legitimate crawlers from even crawling a page (Disallow). A noindex directive should be used to prevent pages from being indexed and appearing in SERPs. Noindexed pages are still crawled, and their links will be followed unless you also add the nofollow directive.
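To make the distinction concrete, here is a minimal sketch (the paths are hypothetical). A Disallow rule in robots.txt stops legitimate crawlers from fetching the pages at all, so any meta tags on them are never seen:

```text
# robots.txt: crawlers never fetch anything under /private/
User-agent: *
Disallow: /private/
```

Whereas at the page level, inside the page's head, a robots meta tag lets the page be crawled but keeps it out of search results:

```text
<!-- page is crawled; it stays out of the index, links are still followed -->
<meta name="robots" content="noindex, follow">
```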


Hi,

I must admit I am not clear on how to implement the sitemap index correctly in SEO Helper 2.

I've watched Joe's video twice now (1 hour 13 minutes each time) as well as read the docs, and while the video is long and Joe explains things, it skips over some detail. @joeworkman

These are the results I’m getting from my Google Search Console.

Submitting the new sitemap succeeds, but then the status is 'Couldn't fetch'.

Here are also screenshots of my RW pages. I think I've got the correct settings and URLs, as I basically followed Joe's settings in the video.

The 404 page settings seemed to work a dream. I'm now getting a proper 404 redirect (great work Joe).

Many thanks for any advice and help.

Thanks Scott

Hi Scott, the only thing I have done differently to you is that I don't have a Sitemap Index stack, as, like you, I only have one sitemap page. Try getting rid of the 'index' stack page and just have the sitemap page with the URLs, and link them to the pages too. See if that makes a difference.

R

Hi Roger,

I’ll have a look at doing that but I was just following Joe’s video tutorial, so hopefully he can clarify for us. @joeworkman

I'm having the same issue with the sitemap file and the Google error "couldn't fetch". I too followed Joe's excellent video! Hopefully Joe can shed some light on how to fix it?

Cheers,
Tim

Your path to the pages sitemap is wrong. The way that you have it setup in the RapidWeaver page inspector, the actual path is https://rapidwebsites.net/pages-sitemap/

This has nothing to do with the SEO Helper stacks. It’s just how you have your folders defined. Whenever you start the folder name with a slash, you are defining the full path to the folder. If you wanted it to be under sitemap, then you will need to define the folder as /sitemap/pages-sitemap
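As an illustration of how the leading slash changes the published location (the domain here is a placeholder, not the real site):

```text
Folder setting in page inspector    Published URL
/pages-sitemap                      https://example.com/pages-sitemap/
/sitemap/pages-sitemap              https://example.com/sitemap/pages-sitemap/
```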

You can also remove the sitemap.xml path as SEO Helper does not create that. Maybe that was added from when you used to use the RW sitemap.

Hi Joe,

Unfortunately it's still not working for me. I have checked the URL and all seems OK! If I preview the sitemap page in RW it shows the XML file, but if I look online the page is blank?




Cheers,
Tim

The Sitemap stack outputs XML, but the URLs themselves behave just as if it were a normal webpage. Do not append .xml to the URL.

Hi Joe, I haven’t appended XML to the URL?
Cheers,
Tim

Publish that to your server and put the links to the 2 sitemap pages here. Remember that these are just like normal webpages and should be named index.php.

Hi Joe, it’s published, the links are - http://www.timstephens.co.uk/sitemap/sitemap.php
http://www.timstephens.co.uk/sitemap_index/index.php

Cheers,
Tim

Also, for your info, the index page indexes the main sitemap and the one for the blog; the blog on this site is WordPress, so it uses an XML file.
Cheers,
Tim

The PHP is crashing on your server. In order to find out why, you will need to find the PHP error_log file. Every host is different. Most of the time the server will create a file called error_log inside the same folder on the server as the webpage. Other times the host may display this data inside cpanel.
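If you have SSH access, a quick way to look is something like the sketch below. The folder path is an assumption (adjust it to your own document root); if the file isn't there, check cPanel or ask your host.

```shell
# Sketch: find and inspect the PHP error_log for the failing page.
# Paths are assumptions (adjust to your own hosting setup).
cd "${DOCROOT:-.}/sitemap" 2>/dev/null || cd .   # folder containing sitemap.php
if [ -f error_log ]; then
    tail -n 20 error_log          # show the most recent PHP errors
else
    echo "no error_log here: check cPanel or ask your host"
fi
```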

Hi Joe,

Logging was disabled! So enabled it and this is the log so far!

[Tue Jul 23 12:53:21.786815 2019] [fcgid:warn] [pid 31562] [client 37.9.113.67:1033] mod_fcgid: stderr: PHP Warning: require_once(): http:// wrapper is disabled in the server configuration by allow_url_include=0 in /home/hp3-linc1-nfs1-w/049/334049/user/htdocs/sitemap/sitemap.php on line 54
[Tue Jul 23 12:53:21.787775 2019] [fcgid:warn] [pid 31562] [client 37.9.113.67:1033] mod_fcgid: stderr: PHP Warning: require_once(http://www.timstephens.co.uk/rw_common/plugins/stacks/seo-helper/seo-helper.php): failed to open stream: no suitable wrapper could be found in /home/hp3-linc1-nfs1-w/049/334049/user/htdocs/sitemap/sitemap.php on line 54
[Tue Jul 23 12:53:21.787790 2019] [fcgid:warn] [pid 31562] [client 37.9.113.67:1033] mod_fcgid: stderr: PHP Fatal error: require_once(): Failed opening required ‘http://www.timstephens.co.uk/rw_common/plugins/stacks/seo-helper/seo-helper.php’ (include_path=’.:/usr/share/pear’) in /home/hp3-linc1-nfs1-w/049/334049/user/htdocs/sitemap/sitemap.php on line 54
[Tue Jul 23 13:07:21.160928 2019] [fcgid:warn] [pid 37189] [client 92.13.130.14:57753] mod_fcgid: stderr: PHP Warning: require_once(): http:// wrapper is disabled in the server configuration by allow_url_include=0 in /home/hp3-linc1-nfs1-w/049/334049/user/htdocs/sitemap/sitemap.php on line 54
[Tue Jul 23 13:07:21.163440 2019] [fcgid:warn] [pid 37189] [client 92.13.130.14:57753] mod_fcgid: stderr: PHP Warning: require_once(http://www.timstephens.co.uk/rw_common/plugins/stacks/seo-helper/seo-helper.php): failed to open stream: no suitable wrapper could be found in /home/hp3-linc1-nfs1-w/049/334049/user/htdocs/sitemap/sitemap.php on line 54
[Tue Jul 23 13:07:21.163455 2019] [fcgid:warn] [pid 37189] [client 92.13.130.14:57753] mod_fcgid: stderr: PHP Fatal error: require_once(): Failed opening required ‘http://www.timstephens.co.uk/rw_common/plugins/stacks/seo-helper/seo-helper.php’ (include_path=’.:/usr/share/pear’) in /home/hp3-linc1-nfs1-w/049/334049/user/htdocs/sitemap/sitemap.php on line 54

Cheers,
Tim

In the advanced project settings, set the File Link setting to be “Relative to Page”
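The log lines above point at the cause: with allow_url_include=0 (the default on most hosts), PHP refuses to require_once() a file over http://. Setting File Links to "Relative to Page" makes the generated include a filesystem path instead. A sketch of the difference, where the relative path is my assumption based on the folder layout shown in the log:

```php
// Fails when allow_url_include=0: PHP will not include over HTTP
require_once 'http://www.timstephens.co.uk/rw_common/plugins/stacks/seo-helper/seo-helper.php';

// Works: a filesystem path, which "Relative to Page" produces
require_once '../rw_common/plugins/stacks/seo-helper/seo-helper.php';
```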

Hi Guys n Gals

Just thought I’d share some feedback after spending a lot of time with Joe’s SEO stacks and implementing them tirelessly into my pages.

The work is not yet finished by a long shot, but for one of my pages, after including 6 sets of structured data and using all known meta tags, I have received the ranking shown in the images below.

The search term "Momentum Harmony" represents a very large discretionary fund manager firm that is a "household name" in the investment world, both in the UK & South Africa.

Prior to using the suite of SEO stacks, I was placed nowhere.

Since using the stacks thoughtfully, you can see that I now rank in second place, behind only Momentum Harmony themselves but ahead of the FT & Nasdaq entries (I'm at least happy with such positioning).

If this is not 'proof of the pudding' that meta and structured data work, I really do not know what further evidence is required.

My hope is that Joe continues to work hard on this set of stacks and further improves their range and scope, particularly as it applies to structured data sets and configurations.

As a side note, since using image tags within the sitemap stacks you'll also note that my images appear in 1st, 6th and 7th positions; again, set amongst the competition, not bad results.

These are just 2 examples of many I am now witnessing. As I continue to work on and focus each of my 120+ pages, I know that further rewards will follow.

Paul

Hi Joe, that fixed the pages and I can now view them. But Google still says "couldn't fetch". I've tried re-submitting, and it makes no difference?

Cheers,
Tim