FTP Publishing Problems


(Graham OHare) #21

Hi Shen, thanks for your tips. Like you, I was never aware that single pages could be published. I have managed to upload some pages with a smaller number of files (sub 100), but I'm still failing on the larger file counts. It's good to know I'm not alone with this issue and it's not an isolated incident as I thought; like you, I was slowly going mad after so many years of problem-free publishing. RW support are currently working with me to resolve the issue, and it's good to know they are at least on the case after I sent multiple mails to my hosting service to clear their end.


(Andy Francis) #22

Had the same issues. The cure for me was just changing the 'publishing method' at the top to 'SFTP' and all was well again. Also, setting connections to 4 (fast) or below can often help too. I don't know if that will solve your issue, but it sorted mine out.


(Graham OHare) #23

Thanks Andy, your tip to switch to SFTP seems to work. It's not as fast, but at this stage I simply need to get the sites uploaded, and if it takes a little longer, that's fine. Thanks to all of those who have contributed here, even the one who needed to post in a rude fashion, which seems to have more to do with his own issues than the bigger picture. To the rest of you: it's a great community and it's much appreciated. At least if anyone else has the same issues with RW publishing, we have a workaround.


(Doug Bennett) #24

SFTP is generally more reliable than FTP, but the biggest advantage is security. Standard FTP has no encryption: the credentials (login, password) are sent in the clear, making it easy for anyone with a packet "sniffer" to grab them and take over your site(s).


(Greg Schneck) #25

Hello… a small note: the "cache busting links" option doesn't work when single-page publishing is used (at least for me). I have a 1,600-page site and I use single-page publishing almost exclusively; there are very few times when the entire site needs to be published. If I need "cache busting links", I use the "Export Site" option and then merely upload the appropriate files/folders.

As info: due to its size, my site is broken down into several project files, with each major section of the site in a project (or two). I can't remember the last time I had a publishing problem.


(Doug Bennett) #26

The cache busting links option has a known bug. It adds a query string, based on the date and time, to the end of files that get updated:

?rwcache=507487490

This is applied to CSS and other files to force browsers to re-read files that would otherwise have the same name and be served from the cache. It's a technique used all over the Internet, including sites like https://stackoverflow.com. The problem, which has been reported to RapidWeaver as far back as Jan '17, is that RapidWeaver only changes the query string when you publish all files.
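The technique itself is simple: tack a version number onto the URL so the browser treats an updated file as a brand-new resource. A minimal sketch of the idea (the helper name is mine, not RapidWeaver's; a real implementation would derive the version from the file's modification time so it only changes when the file does):

```python
import time
from urllib.parse import urlsplit, urlunsplit

def add_cache_buster(url, version=None):
    """Append an rwcache-style query string so browsers skip their cached copy.

    `version` defaults to the current Unix time; deriving it from the file's
    mtime instead would keep the URL stable until the file actually changes.
    """
    if version is None:
        version = int(time.time())
    scheme, netloc, path, query, fragment = urlsplit(url)
    buster = f"rwcache={version}"
    # Preserve any existing query string, and keep the fragment (if any) last.
    query = f"{query}&{buster}" if query else buster
    return urlunsplit((scheme, netloc, path, query, fragment))

print(add_cache_buster("/styles.css", version=507487490))
# → /styles.css?rwcache=507487490
```

Because the query string is part of the URL the browser caches against, publishing only some pages while leaving the old `?rwcache=` value on others is exactly what causes the partial-publish bug described above.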


(Greg Schneck) #27

I've also seen times when the query string was added to a script call, causing it not to work. I sometimes use the "addthis" service to add social media links; with cache busting links active, the ?rwcachexxxxxxx was added to the script call. I can't run cache busting links on any page where the javascript widget is called. If anyone is interested in this "bug" I'll be happy to supply sample output code. @dan ??


(Greg Schneck) #28

Example of cache busting bug:

Here is generated output with cache busting OFF:

> <!-- addthis.com -->
> <script type="text/javascript" src="//s7.addthis.com/js/300/addthis_widget.js#pubid=ra-xxxxxxxxxxdb"></script>

Here is the output with cache busting ON (note the end of the line):

> <!-- addthis.com -->
> <script type="text/javascript" src="//s7.addthis.com/js/300/addthis_widget.js#pubid=ra-xxxxxxxxxxdb?rwcache=553099131"></script>
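What breaks here is URL ordering: in a URL the query string must come before the `#` fragment, but RapidWeaver appends `?rwcache=…` to the very end of the string, so it lands inside the `#pubid=…` fragment and the widget sees a mangled publisher ID. A sketch of the wrong vs. a correct placement (my own illustration, not RapidWeaver's code):

```python
from urllib.parse import urlsplit, urlunsplit

SRC = "//s7.addthis.com/js/300/addthis_widget.js#pubid=ra-xxxxxxxxxxdb"

# What RapidWeaver appears to do: blindly append, landing inside the fragment.
broken = SRC + "?rwcache=553099131"

# A correct implementation would insert the query before the fragment.
scheme, netloc, path, query, fragment = urlsplit(SRC)
fixed = urlunsplit((scheme, netloc, path, "rwcache=553099131", fragment))

print(broken)  # …addthis_widget.js#pubid=ra-xxxxxxxxxxdb?rwcache=553099131
print(fixed)   # …addthis_widget.js?rwcache=553099131#pubid=ra-xxxxxxxxxxdb
```

Since fragments are never sent to the server anyway, appending after the `#` also means the cache buster has no effect on the request itself.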

(Doug Bennett) #29

The addthis problem could only be addressed by a way to exclude a script file from cache busting links, which would probably be pretty complicated to fix. I don't know much about addthis, but they allow users to add parameters (aka a query string) to the URL of the script file; I would guess it's something to do with tracking options.


So their software is looking for a query string to process. I would guess the only way to fix this would be the ability to flag script files that shouldn't have a cache busting link applied to them. The partial-page publishing issue should be a much easier fix.


(system) #30

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.