Hacker News

Very cool idea, and nice site. I noticed that you are serving your own jquery. I've read that it's better to link to Google's host as it is more likely to be cached (and other reasons). Is this a conscious decision on your part, or is it just a part of the puzzle you haven't wrestled with yet? (Honest question- I don't know the right answer because I haven't wrestled with it yet.)

Google jquery link:

  <script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.3/jquery.min.js"></script>


I consciously prefer serving it myself rather than using Google, because a first-time visitor will be requesting other static files from the site anyway, so a cached copy of jQuery makes minimal difference to loading time (although it does save on bandwidth).

However, if there isn't a cached copy of Google's jQuery, there is the overhead of a DNS query and a new HTTP connection to Google.

Compared with the already-open keepalive connection to my static server, that overhead increases load time dramatically.

First impressions count, and you only have a few vital seconds to make a good one.


You shouldn't have to depend on CDNs, as there's no guarantee they'll be up all of the time. I usually have a local fallback that can be triggered like this:

    <script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
    <script>!window.jQuery && document.write(unescape('%3Cscript src="/js/jquery-1.4.2.min.js"%3E%3C/script%3E'))</script>
The second script looks for the jQuery global object that should exist after the CDN fetch. If it doesn't exist, it knows to get your own copy.

(If you're wondering where the 'http:' part of that src attribute went: omitting the scheme is a safer way to request a resource when you don't know whether you are on http or https.)
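A side note on why the fallback percent-encodes the tag: a literal </script> inside an inline script would end the script block prematurely, so the markup is encoded and only decoded at runtime with unescape. A quick sketch of the string the fallback actually writes (runnable in Node, where unescape is a global; the path matches the snippet above):

```javascript
// Decode the percent-encoded fallback tag, exactly as the inline
// script does before handing the string to document.write().
var fallback = unescape('%3Cscript src="/js/jquery-1.4.2.min.js"%3E%3C/script%3E');

console.log(fallback);
// → <script src="/js/jquery-1.4.2.min.js"></script>

// In the browser, the tag is only written when the CDN copy failed:
//   !window.jQuery && document.write(fallback);
```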

Also, you should try to place your <script> tags near the bottom of the <body> rather than in the <head>, so that they don't block the rest of the page from loading/rendering.
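To make the placement concrete, here's a minimal page sketch (the stylesheet and /js/app.js paths are just placeholders): the two jQuery tags from above, plus your own scripts, sit at the end of <body>, so everything above them renders before any script is fetched.

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example</title>
  <link rel="stylesheet" href="/css/site.css">
</head>
<body>
  <p>Page content renders before any script is fetched.</p>

  <!-- Scripts last: CDN copy, local fallback, then your own code. -->
  <script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
  <script>!window.jQuery && document.write(unescape('%3Cscript src="/js/jquery-1.4.2.min.js"%3E%3C/script%3E'))</script>
  <script src="/js/app.js"></script>
</body>
</html>
```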


Have you tested that to make sure that it blocks on the first script tag in every browser?

Over the internet (as opposed to on your dev box), I'd expect that to always evaluate to false and therefore include your local script.

You might want to look into putting that call into window.onload so that it does what you think it does.


The point made (below) by @dspillet is correct, and important for one very good reason:

You will almost certainly include later scripts that depend on the jQuery object existing (otherwise, what's the point of loading jQuery at all). Imagine this trick wasn't used and you simply served a local (or CDN-hosted) copy of jQuery, then used it in later scripts: it's reasonable to assume jQuery exists by then, because each script blocks, and even where it doesn't, the browser still makes sure scripts execute in order. The fallback snippet runs under the same ordering guarantee, so it's perfectly safe to use without worrying about the order of things.

This is exactly why I (and many others) suggest that you put all of your <script> tags at the bottom of the <body> element. They block page rendering, so if they're in the <head>, or dotted around the <body>, they're going to delay the presentation of the page to the visitor.


I believe all browsers block execution of the script (and rendering of the content that follows it), so his code should work generally.

Even the latest browsers that do not block further object (script) requests during the download and execution of a script will still execute scripts sequentially, so his "is jQuery present" check will not fire until the external script has either returned and executed (the check passes, and nothing else happens) or errored (jQuery is not present, and the document.write executes, loading jQuery from the local resource).


That looks like a neat trick. Thank you.


I can't use external CDNs in my day-job as our clients require certain audits that I doubt the CDN would agree to, though that isn't a problem for this project.

The reason I serve my own jQuery (rather than using the CDN-with-local-fallback option in collypops' reply), even for my own personal projects, is the paranoia of not wanting to trust code from an external source. OK, so Google's CDN (or any of the other big players) is much less likely to get hacked than my personal servers, but their CDN is also a much more attractive target for a DNS poisoning attack. If an attacker manages, via DNS poisoning, to convince many people's machines to send requests for jQuery to them rather than to Google, then any site using that CDN could have unwanted code injected. If I serve my own jQuery file, this risk is gone (unless the DNS spoofing attack targets my domain names specifically, of course, but I'm not a big enough fish for anyone to bother).


One reason not to use CDN-hosted common scripts is to avoid sharing visitor statistics with the CDN. Believe it or not, that is a valid concern for many businesses.


When it's cached, your browser won't make an HTTP request to the CDN. That's the entire point.

They can do some very rough back-of-the-envelope calculations, based on cache-expiry headers and the number of requests, to estimate how many new visitors you are bringing to jQuery, but not much else. Dan Kaminsky demonstrated this earlier with his DNS/TTL cache-snooping tricks.

And by the time you are large enough to have an impact, your audience will be large enough to justify hosting jQuery at your own URL.

In short, USE THE CDN JQUERY. :-)


I use MaxCDN and will use it for this project as well.


Yes, Google or another CDN is the best option. It's something I will switch over to.



