Who can tell me how to set up RapidBot so that it allows access only to the main index page (HOME page) and disallows access to everything else on the site? The site in question has a lot of folders and pages, so it would be impractical to disallow every single one of them. Surely there must be a way to lump them together somehow?
If not using RapidBot, how would one write a robots.txt file that does the same?
No, it should not, the way I understand it. Robots.txt files are processed top-down.
If a crawler hits an explicit Allow rule, it should pass.
I have used similar robots.txt files with subdirectories (never tried it in the home directory) like this, and it works:
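A minimal sketch of such a file, assuming the crawlers you care about support the `Allow` directive and the `$` end-of-URL anchor (neither is in the original 1994 robots.txt standard, but both are honored by Google, Bing, and crawlers following RFC 9309):

```
User-agent: *
Allow: /$
Allow: /index.html
Disallow: /
```

`Allow: /$` permits only the bare root URL, `Allow: /index.html` covers the index page requested by filename (adjust to your actual index file), and `Disallow: /` blocks everything else. Note that modern crawlers such as Googlebot resolve conflicts by the most specific (longest) matching rule rather than strictly top-down order, so this pattern works under either interpretation; older crawlers that ignore `Allow` entirely will simply see `Disallow: /` and skip the whole site.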