Ensuring Latest site is loaded

I just wondered if there is a way to ensure the latest version of a site is shown.
I understand the browser will save certain elements, and I wondered if there is a way round this.

Here is the issue:
We publish info on a page and the client looks at it.
We then update the info as something has changed and upload the new site, but the client gets the original page once again.

I get the same behaviour in my own browser: if I upload a new version of the site, the browser still seems to get the old version until I refresh.

I hope this makes sense and I appreciate any suggestions.

Thanks
Russell

Do you have a live URL that we can look at?

You have to clear the browser cache first.


Always preview in private browsing mode; it should automatically skip the cache (the mechanism by which the browser remembers the previous version of the site to speed up loading for returning visitors) and load the newest version.

Also, you could put a version number or a "last updated" note on the site so your client knows which version they are looking at.

Another option could be to use a password-protected subdomain for unreleased versions of the site, so only the client can see the new changes before approving them, and you can then upload to the public site.
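On an Apache host, a password-protected preview area like this can be set up with HTTP basic auth. A minimal sketch, assuming you control the subdomain's .htaccess (the paths and the "Client preview" name below are placeholders):

```apache
# .htaccess in the document root of the preview subdomain (illustrative paths)
AuthType Basic
AuthName "Client preview"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The password file would be created with something like `htpasswd -c /home/example/.htpasswd clientname` and should live outside the web root so it can't be downloaded.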


All browsers store a ‘cached’ version of pages you have visited in order to open them faster. This causes the problem you are experiencing.

When you are making changes and want to check them live, one solution is to use the browser in 'private' mode, as nothing is cached. However, your client may need to clear their recent browser cache before viewing the latest version. It seems to be hit and miss - sometimes changes show anyway and other times they don't. There may well be some logic to this, but I don't know it.

www.gaht.ch

I understand the cache, but I am also thinking about a lot of sites I visit where I always see an up-to-date site.

I assume this is a type of ‘dynamic content’ for want of a better word.

I get how a blog would work so the latest posts show, but surely there is a way to force a refresh on loading?
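One common way sites keep content fresh without disabling caching altogether is cache-busting: changing the asset URL whenever the file changes, so the browser treats it as a brand-new resource. A sketch (the `?v=` values here are illustrative - bump them whenever the file is updated):

```html
<!-- A versioned query string makes the browser see a new URL
     and fetch a fresh copy instead of reusing its cached one. -->
<link rel="stylesheet" href="style.css?v=2">
<script src="main.js?v=2"></script>
```

Some build tools automate this by embedding a hash of the file contents in the filename itself.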

Check out the first article here:

I’ve used similar while in client approval mode. Don’t forget to remove (or at least review) when the site goes into production.
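For reference, the cache-disabling meta tags typically shown in articles like that one look like the following; browser support is patchy, so treat them as a development-time aid rather than a guarantee:

```html
<!-- Ask the browser not to cache this page; support varies by browser. -->
<meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate">
<meta http-equiv="Pragma" content="no-cache">
<meta http-equiv="Expires" content="0">
```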


But if I remove that meta tag when I publish, it seems to defeat the original purpose :thinking:

I don’t follow you too well.

You can add the meta tag to disable caching while you are in site-development mode; that way the client is going to see an uncached version of the site. So far so good?

Then, when the site is completed, you can decide on a caching strategy that works.

Make sense?


If you are running on Apache, I wouldn't mess with meta tags. It's so much easier to add cache control to the HTTP response headers with .htaccess directives.

You want to cache most files, but even on a production site you may not want to cache the HTML or PHP files. They are usually tiny compared to images (JPEG, SVG, GIF, etc.), fonts (WOFF, WOFF2), CSS, and JavaScript.

Of course, it's up to you what you want to cache and for how long. Here is a sample of what you can add to the .htaccess file. It's commented (comments start with a # in Apache), and you can change the values to whatever you want.

#
# BEGIN Cache -- max-age values are in seconds
#
<IfModule mod_headers.c>
    # For images and "data" type files, cache for 30 days
    <FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|swf|ttf|otf|woff|woff2|eot|svg)$">
        Header set Cache-Control "max-age=2592000, private"
    </FilesMatch>
    # For CSS, cache for 7 days
    <FilesMatch "\.(css)$">
        Header set Cache-Control "max-age=604800, private"
    </FilesMatch>
    # For JavaScript, cache for 7 days
    <FilesMatch "\.(js)$">
        Header set Cache-Control "max-age=604800, private"
    </FilesMatch>
    # For XML and TXT, cache for 1 day and revalidate
    <FilesMatch "\.(xml|txt)$">
        Header set Cache-Control "max-age=86400, private, must-revalidate"
    </FilesMatch>
    # For HTML and PHP files, cache for 1 second and revalidate
    <FilesMatch "\.(html|htm|php)$">
        Header set Cache-Control "max-age=1, private, must-revalidate"
    </FilesMatch>
    # Override for one specific file: cache for 1 hour
    <FilesMatch "files/image_stack_img_487313\.jpg$">
        Header set Cache-Control "max-age=3600, private, must-revalidate"
    </FilesMatch>
</IfModule>
# END Cache

Yeah… that's much simpler and way safer…


@indridcold Nice article. The article states:

This meta tag is recognized in Firefox, Chrome, and Internet Explorer.

Does this mean the meta tag does not work with Safari? (In my case that would eliminate about 35% of my viewers.)


The Hongkiat articles are quite good for finding a foothold into something useful, but rarely contain the meatier narrative about the topic. It's a great question and not a topic I have any special knowledge about, other than having used meta tags in the past to avoid agonising exchanges with clients where they can't see your latest changes and you have to talk them through clearing their cache, etc.

I like MDN as a source, as it is usually very well written and referenced; I find it easier to understand. Currently MDN seems to suggest that cache-control meta tags are supported in Safari:

There are two supporting articles around cache-control that were pretty digestible. This is the first:

There is a really good article which breaks down the methods around cache-control here:

Like I say, it's not a subject I have done anything other than tinker with a few times. It just seemed to me that if meta tags got the OP what they needed without having to crack open their .htaccess file, then that's probably a good thing.


@indridcold Many thanks for these articles. I have not used meta tags for something like cache control, so your response piqued my interest. Using this approach for a few web pages will prove quite helpful to me. Again: thanks!


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.