How do I create a robots.txt file to ignore certain pages? I use the latest RW and Stacks 3. I’m not a programmer, but I have managed to create my site with the help of Will Woodgate and Greg Barchard and their software.
thanks!
Great tutorial from RM.
Listen to the latest podcast.
Thanks all! So the RapidWeaver community is a paid subscription site?
Not everything on the community page requires a paid subscription; some videos do, some do not.
Hey @barchard Greg, can I please get this link again… it’s dead.
thanks
This looks like the link…
Thanks, much appreciated, @abcole
You can use http://tools.seobook.com/robots-txt/generator/ to create a robots.txt file and upload it to your website’s root folder…
@Tour2pondy
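For anyone who prefers to write it by hand, a minimal robots.txt sketch is below. The /private/ and /thank-you.html paths are just placeholders; swap in the pages you actually want crawlers to skip, and upload the file to the root folder of your site as mentioned above.

```
# robots.txt — goes in the root folder of your published site
# (paths below are examples only; replace them with your own pages)

User-agent: *
Disallow: /private/
Disallow: /thank-you.html
```

Note that Disallow rules are a request to well-behaved crawlers, not a security measure; anything truly private should be protected some other way.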