
ianspangler

Why does Jeffrey Way not advocate loading files from a CDN in production?

I am wondering why Jeffrey recommends hosting large JavaScript and CSS libraries like Bootstrap and jQuery locally, while other sources point out that loading the files from a CDN boosts performance and load speed and reduces the load on your own server.

0 likes
20 replies
SaeedPrez

@ianspangler

  • You only load the file once; after that it's in your browser cache
  • One big file can be faster to load than many small files (many HTTP requests = many delays); see the sketch below
  • You have more control over local files than over external files
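For example, a minimal Elixir setup for the one-big-file approach (just a sketch; the file names are placeholders, not Jeffrey's actual config):

    // gulpfile.js -- concatenate vendor and app scripts into one cacheable file.
    var elixir = require('laravel-elixir');

    elixir(function (mix) {
        // Paths are relative to resources/assets/js by default.
        mix.scripts([
            'vendor/jquery.js',
            'vendor/bootstrap.js',
            'app.js'
        ], 'public/js/all.js');
    });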
jlrdw

Yeah, just go online and visit some of those sites using CDNs; it should be fairly obvious. Some of the slowest-loading things I've seen in my life.

1 like
d3xt3r

@SaeedPrez

Beg to differ ...

  1. "You only load the file once, after that it's in your browser cache": that's not the purpose of using a CDN.

  2. "One big file can be faster to load than many small files (many HTTP requests = many delays)": again, a CDN can be set up to serve the same bundle.

  3. "You have more control over local files than external files": agree to some extent, but for major changes you will again rely on versioning, which is true with a CDN as well.

@ianspangler

To me, a CDN really comes into the picture when you have heavy content to deliver, like a media-heavy website (e-commerce). For just a few JS and CSS files you could use a CDN too, but the complexity and cost of setting up your own CDN proxy don't add up.

If you want, you could serve the jQuery and Bootstrap files directly from their hosted CDNs. That way your browser can fire multiple simultaneous requests and not be limited by the maximum parallel HTTP connections per host.
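Something like this, so the two libraries come from different hosts and can download in parallel (URLs and versions are only examples):

    <!-- Each library lives on a different host, so the browser's
         per-host connection limit doesn't serialize the downloads. -->
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/2.2.4/jquery.min.js"></script>
    <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>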

Gerard

I think custom, site-specific resources should be loaded from your own server, but CDNs also have cache advantages for popular resources, right?

For example, I'm pretty sure that a lot of sites use the same CDN for jQuery or Bootstrap. So if you use the same CDN and the same versions as other sites, most users will already have those resources in their browser cache when they visit your site, am I right?

SaeedPrez

@d3xt3r I'm not against CDNs in any way, but I did make the assumption in this case that he was speaking about using the "default" CDNs for loading popular libraries vs downloading them and using Elixir to compile them into one big file.

And he asked why Jeffrey would use local files, so I listed some of the pros.

ianspangler

The application in question is both accessed globally and very image-heavy, but the images are already served through CloudFront, so there isn't much concern with them right now. I have a total of about 20 JS and CSS plugins/libraries that are currently bundled together with Gulp into a single all.js file, and one of them (Algolia's instantsearch.min.js) is verifiably huge at 400 KB (how they got away with releasing a minified file that big to the public I do not know... but it is an immensely useful library for what needs to be done). So I was thinking of offloading the biggest ones (Algolia, Bootstrap, and jQuery) to CDNs. I can't tell if this strategy will pay off yet. We are not close to releasing anything, so there is time to test and decide. I reckon it partly depends on how much we pay for quality hosting; with a cheap host, I'm guessing it won't serve those files better than a regionally optimized CDN would.
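Concretely, I'm picturing keeping the small plugins in the Gulp bundle and pulling only the heavy libraries from CDNs, something like this (URLs and versions are illustrative, not tested):

    <!-- Heavy libraries straight from public CDNs: -->
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/2.2.4/jquery.min.js"></script>
    <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
    <!-- ...plus instantsearch.min.js from whatever CDN Algolia provides. -->

    <!-- The remaining small plugins stay in the Gulp bundle: -->
    <script src="/js/all.js"></script>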

The other factor/issue at play is that I am currently setting no-cache HTTP headers on all of the pages. I prefer not to do this, but so far I have not found another way of getting around browsers' back-forward cache. I wrote about this in another discussion here: https://laracasts.com/discuss/channels/vue/persisting-component-state-when-clicking-back-button-in-browser
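One alternative I'm still evaluating is leaving caching on and detecting back-forward restores on the client via the pageshow event (sketch only; refreshComponentState is a hypothetical stand-in for whatever re-initializes the components):

    // Fires on every page show, including restores from the back-forward cache.
    window.addEventListener('pageshow', function (event) {
        if (event.persisted) {
            // The page came back from the bfcache rather than a fresh load,
            // so re-sync the stale component state here.
            refreshComponentState(); // hypothetical re-init hook
        }
    });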

If any of you have additional tips tailored to this situation, I would greatly appreciate it! Thanks.

SaeedPrez

@Gerard yes.

@ianspangler if you have a lot of global visitors, and many non-returning visitors, you could speed up your site by using well-known CDNs.

d3xt3r

On top of that, if you're already using CloudFront (assuming you have multiple edge locations enabled for your global users), you may simply choose to serve the JS file through CloudFront as well. One big bundle...

bashy

Where does Jeffrey say this? Just wondering.

SaeedPrez

In his Laracasts.com code maybe? ☺

<script src="/js/all.min.js?v=137"></script>
1 like
ianspangler

So would it actually be better to push all JS and CSS through a CDN serving my own files, like CloudFront or Cloudflare, as opposed to the separate default CDNs for each major library (i.e. Google's hosted libraries for jQuery, BootstrapCDN for Bootstrap, etc.)? I haven't tried the first approach before, but I imagine it introduces some added complexity in file management and in how files are pushed from local to development to production. Some folks also suggest keeping local backups in case the CDNs go down... so even more angles I would need to consider.
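The fallback pattern itself seems simple enough, something like this (the local path is just an example):

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/2.2.4/jquery.min.js"></script>
    <script>
        // If the CDN request failed, window.jQuery is undefined;
        // fall back to a locally hosted copy.
        window.jQuery || document.write('<script src="/js/vendor/jquery.min.js"><\/script>');
    </script>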

SimplyCorey

I think there is a lot of opinion in this topic. Some recommend compiling all of your resources with something like Elixir to reduce the total number of files a browser needs to download. Others say you should use popular CDNs like Google's; the thinking there is that the required files most likely already exist in the visitor's local cache, so they don't need to be downloaded again.

I wouldn't overthink this too much unless it has started to really affect the speed of your site. Do what is best for you. Currently, I pull in all of the required files through NPM and then have Elixir compile them down into a single file, which is pushed out through our CDN (AWS CloudFront).
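If cache busting for that single file is a concern, Elixir can version it too, roughly like this (paths illustrative, not my exact setup):

    // gulpfile.js -- bundle, then append a content hash to the file name
    // so browser and CDN caches invalidate whenever the bundle changes.
    var elixir = require('laravel-elixir');

    elixir(function (mix) {
        mix.scripts(['app.js'], 'public/js/all.js')
           .version('js/all.js'); // emits e.g. public/build/js/all-16d570a7.js
    });

The Blade side then references it with the elixir() helper, e.g. <script src="{{ elixir('js/all.js') }}"></script>, so the hashed name is resolved automatically.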

SaeedPrez

Just to clarify, I'm not saying locally hosted files are better than a CDN or vice versa. I was simply answering the question of why Jeffrey would use locally hosted files by pointing out some of the benefits.

At the end of the day, as has been mentioned here already, each application is unique and you have to decide what is best for your application and your visitors.

jekinney

Each browser has a limited number of simultaneous downloads at once. Quick search:

http://stackoverflow.com/questions/985431/max-parallel-http-connections-in-a-browser

Keep in mind this includes images, CSS, and scripts. Though it's definitely not always attainable, I try to keep to 4 per page. With generic CDNs for basic Bootstrap, jQuery, and a few plugins, plus your images, you're already over that budget.

For me, I rarely use a full CSS framework. I use only the parts I need, plus my custom CSS used throughout the site. Gulp it all together and make it available on the home page.

If some other page, say a profile page, uses CSS specific to that page, I create a separate file and use Blade to inject it, so it's only downloaded on that page.

Same for JavaScript. There's no need to bundle, say, Dropzone into the main js file for every page. Just inject it where needed.
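With Blade stacks, for instance, that looks roughly like this (file names are just examples, not my actual setup):

    {{-- layouts/app.blade.php: the layout exposes one slot for page assets. --}}
    <script src="/js/main.js"></script>
    @stack('scripts')

    {{-- profile.blade.php: only this page pays for Dropzone. --}}
    @push('scripts')
        <script src="/js/dropzone.js"></script>
    @endpush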

The biggest issue, imo, is images. Many times people use a 1200 x 1200 image and resize it to 200 x 200 with CSS; you're still downloading the original image. Use multiple image sizes instead.
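The srcset attribute covers exactly that, e.g. (file names hypothetical):

    <!-- The browser picks the smallest file that fits the slot, instead of
         downloading the 1200x1200 original and scaling it down in CSS. -->
    <img src="/img/avatar-200.jpg"
         srcset="/img/avatar-200.jpg 200w, /img/avatar-1200.jpg 1200w"
         sizes="200px"
         alt="avatar">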

I see sites with 20-plus separate CSS and script files. That's just lazy, or ignorance, or both.

CDNs are nice if you scale vertically; then you have one place for your images etc. I personally use Google's CDN hosting. Less than $5 a month, fast and easy.

bobbybouwmann

@SaeedPrez Jeffrey might have done it that way on Laracasts.com, but in his videos he always uses the CDN file. I know he does that to be quick, but it's not bad practice at all. Browsers are perfectly capable of caching popular CDN sources, which means you don't have to make a request at all ;)

1 like
MikeHopley

"Each browser has a limited number of simultaneous downloads at once."

It's worth noting that this has changed. Many browsers now support HTTP/2, which allows multiplexed streams. This greatly increases parallel downloads.

As a result, some performance best practices are being reconsidered. For example, domain sharding is now an anti-pattern. The benefit of combining files has reduced, as now there is little overhead from the extra HTTP requests; in many cases it's worth keeping the files separate because that will improve caching.

3 likes
ianspangler

Thanks guys. These are all useful answers so I can't really mark one as "best". @jekinney Thanks for the link. The ETag could be helpful for my caching situation.

1 like
bashy

@SaeedPrez In no way does that mean he publicly supports or recommends doing it that way... The Laravel website is the same.

It all depends on the site and the setup. I use CloudFlare for my sites, so all resource files are cached and retrieved from their CDN.

1 like
jekinney

@MikeHopley The Google link above mentions that: separating files into ones that update frequently and ones that almost never change. But even with all that awesomeness, no one has mentioned crappy mobile LTE connections... lol

The only issue I see with CDNs, like I said:

'socket.js',
'jquery.js',
'bootstrap.js',
'icheck.js',
'app.js',
'dropzone.js',
'cropper.js',
'chart.js',
'vue.js',
'vue-resource.js',
'main.js',
'vue/start.js',

'bootstrap.css',
'main.css',
'fontawesome.css',
'google/fonts',
'google/fonts',

<img 1>
<img 2>
<img 3>
<img 4>
<img 5>

23 downloads... LOL

Also look at the 304s: a request is still being sent, causing a delay, though a minimal one since the file isn't re-downloaded. But with a higher cache timeout set, the browser doesn't send new requests every 120 seconds.

