The Importance (and Ease) of Minifying your CSS and JavaScript and Optimizing PNGs for your Blog or Website
Hello Dear Reader. You may feel free to add a comment at the bottom of this post, something like "Um, DUH!" after reading this. It's funny how one gets lazy with their own website. What's the old joke, "those who can't, teach." I show folks how to optimize their websites all the time but never got around to optimizing my own.
It's important (and useful!) to send as few bytes of CSS and JS and HTML markup down the wire as possible. It's not just about size, though; it's also about the number of requests to get the bits. In fact, that's often more of a problem than file size.
First, go run YSlow on your site.
YSlow is such a wonderful tool and it will totally ruin your day and make you feel horrible about yourself and your site. ;) But you can work through that. Eek. First, my images are huge. I've also got 184k of JS, 21k of CSS and 30k of markup. Note that my favicon is small. It was a LOT bigger before and even sucked up gigabytes of bandwidth a few years back.
YSlow also tells me that I am making folks make too many HTTP requests:
This page has 33 external JavaScript scripts. Try combining them into one.
This page has 5 external stylesheets. Try combining them into one.
Seems that speeding things up is not just about making things smaller, but also about asking for fewer things and getting more for the asking. I want to make fewer requests that may have larger payloads, but then those payloads will be minified and then compressed with GZip.
Optimize, Minify, Squish and GZip your CSS and JavaScript
CSS can look like this:
body {
line-height: 1;
}
ol, ul {
list-style: none;
}
blockquote, q {
quotes: none;
}
Or like this, and it still works.
body{line-height:1}ol,ul{list-style:none}blockquote,q{quotes:none}
There's lots of ways to "minify" CSS and JavaScript, and fortunately you don't need to care! Think about CSS/JS minifying kind of like the Great Zip File Wars of the early nineties. There's a lot of different choices, they are all within a few percentage points of each other, and everyone thinks theirs is the best one ever.
There are JavaScript-specific compressors. You run your code through these before you put your site live.
- Packer
- JSMin
- Closure compiler
- YUICompressor (also does CSS)
- AjaxMin (also does CSS)
And there are CSS compressors:
- CSSTidy
- Minify
- YUICompressor (also does JS)
- AjaxMin (also does JS)
- CSSCompressor
And some of these integrate nicely into your development workflow. You can put them in your build files, or minify things on the fly.
- YUICompressor - .NET Port that can compress on the fly or at build time. Also on NuGet.
- AjaxMin - Has MSBuild tasks and can be integrated into your project's build.
- SquishIt - Used at runtime in your ASP.NET applications' views and does magic at runtime.
- UPDATE: Chirpy - "Mashes, minifies, and validates your javascript, stylesheet, and dotless files."
- UPDATE: Combres - ".NET library which enables minification, compression, combination, and caching of JavaScript and CSS resources for ASP.NET and ASP.NET MVC web applications."
- UPDATE: Cassette by Andrew Davey - Does it all, compiles CoffeeScript, script combining, smart about debug- and release-time.
There's plenty of comparisons out there looking at the different choices. Ultimately, since the compression percentages don't differ much between tools, you should focus on two things:
- compatibility - does it break your CSS? It should never do this
- workflow - does it fit into your life and how you work?
For me, I have a template language in my blog and I need to compress my CSS and JS whenever I deploy a new template. A batch file and a command-line utility work nicely, so I used AjaxMin (yes, it's made by Microsoft, but it did exactly what I needed).
I created a simple batch file that took the pile of JS from the top of my blog and the pile from the bottom and created a .header.js and a .footer.js. I also squished all the CSS, including my plugins that needed CSS, and put them in one file while being sure to maintain file order.
I've split these lines up for readability only. This looks complex, but it's just making three files, two JS and one CSS, out of all my mess of required files; all the ones at the bottom support my code highlighter.
set PATH=%~dp0;"C:\Program Files (x86)\Microsoft\Microsoft Ajax Minifier\"
ajaxmin -clobber
scripts\openid.css
scripts\syntaxhighlighter_3.0.83\styles\shCore.css
scripts\syntaxhighlighter_3.0.83\styles\shThemeDefault.css
scripts\fancybox\jquery.fancybox-1.3.4.css
themes\Hanselman\css\screenv5.css
-o css\hanselman.v5.min.css
ajaxmin -clobber
themes/Hanselman/scripts/activatePlaceholders.js
themes/Hanselman/scripts/convertListToSelect.js
scripts/fancybox/jquery.fancybox-1.3.4.pack.js
-o scripts\hanselman.header.v4.min.js
ajaxmin -clobber
scripts/omni_external_blogs_v2.js
scripts/syntaxhighlighter_3.0.83/scripts/shCore.js
scripts/syntaxhighlighter_3.0.83/scripts/shLegacy.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushCSharp.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushPowershell.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushXml.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushCpp.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushJScript.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushCss.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushRuby.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushVb.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushPython.js
scripts/twitterbloggerv2.js scripts/ga_social_tracker.js
-o scripts\hanselman.footer.v4.min.js
pause
This squished all my CSS down to 26k, and here's the output:
CSS
Original Size: 35667 bytes; reduced size: 26537 bytes (25.6% minification)
Gzip of output approximately 5823 bytes (78.1% compression)
JS
Original Size: 83505 bytes; reduced size: 64515 bytes (22.7% minification)
Gzip of output approximately 34415 bytes (46.7% compression)
That also turned 22 HTTP requests into 3.
Optimize your Images (particularly PNGs)
Looks like my 1600k cold (nothing cached) home page is mostly images, about 1300k of them. That's because I put a lot of articles on the home page, but also because I use PNGs for the images in most of my blog posts. I could be more thoughtful and:
- Use JPEGs for photos of people, things that are visually "busy"
- Use PNGs for charts, screenshots, things that must be "crystal clear"
I can also optimize the size of my PNGs (did you know you can do that?) with PNGOUT before I upload them. For bloggers and for ease I recommend PNGGauntlet, which is a Windows app that calls PNGOut for you. Easier than PowerShell, although I do that also.
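If you'd rather script the squishing than click through a GUI, here's a rough C# sketch of the same idea. It assumes pngout.exe is on your PATH, and the /y switch (skip the overwrite prompt) is from memory, so check pngout's own help first.

using System;
using System.Diagnostics;
using System.IO;

class OptimizePngs
{
    static void Main(string[] args)
    {
        // Folder of images to squish before uploading.
        string folder = args.Length > 0 ? args[0] : ".";

        foreach (string png in Directory.GetFiles(folder, "*.png"))
        {
            long before = new FileInfo(png).Length;

            // Shell out to pngout.exe for each file; /y is assumed to mean
            // "overwrite without prompting" - verify with "pngout /?".
            var psi = new ProcessStartInfo("pngout.exe", "/y \"" + png + "\"") { UseShellExecute = false };
            using (var process = Process.Start(psi))
            {
                process.WaitForExit();
            }

            long after = new FileInfo(png).Length;
            Console.WriteLine("{0}: {1} -> {2} bytes", Path.GetFileName(png), before, after);
        }
    }
}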
If you use Visual Studio 2010, you can use Mad's Beta Image Optimizer Extension that will let you optimize images directly from Visual Studio.
To show you how useful this is, I downloaded the images from the last month or so of posts on this blog totaling 7.29MB and then ran them through PNGOut via PNGGauntlet.
Then I took a few of the PNGs that were too large and saved them as JPGs. All in all, I saved 1359k (that's almost a meg and a half or almost 20%) for minimal extra work.
If you think this kind of optimization is a bad idea, or boring or a waste of time, think about the multipliers. You're saving (or I am) a meg and a half of image downloads, thousands of times. When you're dead and gone your blog will still be saving bytes for your readers! ;)
This is important not just because saving bandwidth is nice, but because perception of speed is important. Give the browser less work to do, especially if, like me, almost 10% of your users are mobile. Don't make these little phones work harder than they need to and remember that not everyone has an unlimited data plan.
Let your browser cache everything, forever
Mads reminded me about this great tip for IIS7 that tells the webserver to set the "Expires" header to a far future date, effectively telling the browser to cache things forever. What's nice about this is that if you or your host is using IIS7, you can change this setting yourself from web.config and don't need to touch IIS settings.
<staticContent>
<clientCache httpExpires="Sun, 29 Mar 2020 00:00:00 GMT" cacheControlMode="UseExpires" />
</staticContent>
You might think this is insane. This is, in fact, insane. Insane like a fox. I built the website, so I want control. I version my CSS and JS files in the filename; others use query strings with versions, and some use hashes. The point is: are YOU in control, or are you just letting caching happen? Even if you don't use this tip, know how and why things are cached and how you can control it.
Compress everything
Make sure everything is GZip'ed as it goes out of your Web Server. This is also easy with IIS7 and allowed me to get rid of some old 3rd party libraries. All these settings are in system.webServer.
<urlCompression doDynamicCompression="true" doStaticCompression="true" dynamicCompressionBeforeCache="true"/>
If there is one thing you can do to your website, it's turning on HTTP compression. For average pages, like my 100k of HTML, it can turn into 20k. It downloads faster and the perception of speed by the user from "click to render" will increase.
Certainly this post just scratches the surface of REAL performance optimization and only goes up to the point where the bits hit the browser. You can go nuts trying to get an "A" grade in YSlow, optimizing for # of DOM objects, DNS Lookups, JavaScript ordering, and on and on.
That said, you can get 80% of the benefit for 5% of the effort with these tips. It'll take you no time and you'll reap the benefits hourly:
- Minifying your JS and CSS
- Combining CSS and JS into single files to minimize HTTP requests
- Turning on GZip compression for as much as you can
- Setting Expires headers on everything you can
- Compressing your PNGs and JPEGs (but definitely your PNGs)
- Using CSS sprites if you have lots of tiny images
I still have a "D" on YSlow, but a D is a passing grade. ;) Enjoy, and leave your tips, and your "duh!" in the comments, Dear Reader.
Also, read anything by Steve Souders. He wrote YSlow.
http://thisthattechnology.blogspot.com/2010/08/enable-iis-compression-on-ria-wcf.html
For versioning my CSS and JS files I created an MVC helper in my project called Url.StaticContent, which works the same way as Url.Content except that it appends the ticks of the last write time for the requested file to the URL as a parameter. I find this better than renaming the files because I have fewer places to "fix" later.
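A minimal sketch of that kind of helper, assuming ASP.NET MVC's UrlHelper (the name StaticContent matches Kelps's description, but the implementation here is illustrative, not his actual code):

using System.IO;
using System.Web.Mvc;

public static class UrlHelperExtensions
{
    // Like Url.Content, but appends the file's last-write ticks as a
    // cache-busting query string parameter.
    public static string StaticContent(this UrlHelper url, string contentPath)
    {
        string virtualPath = url.Content(contentPath);
        string physicalPath = url.RequestContext.HttpContext.Server.MapPath(contentPath);
        long version = File.GetLastWriteTimeUtc(physicalPath).Ticks;
        return virtualPath + "?v=" + version;
    }
}

In a view you'd call Url.StaticContent("~/css/site.css") and get back something like /css/site.css?v=634508439112345678, so the URL (and therefore the cache entry) changes whenever the file does.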
I was thinking about merging the JS files (and CSS too) into one, but then I'd have to change my _layout page, and since I'm using AjaxMin only on the build server, I would be forced to keep all the original files as well in order to be able to test and debug locally. I'm trying to figure out a good way to do this, but I couldn't find anything to my liking yet.
Anyway, great post.
Regards,
Kelps
Overall grade was a D at 69 out of 100.
http://www.flickr.com/photos/adriarichards/6104651958
Grade F on Make fewer HTTP requests
Grade F on Use a Content Delivery Network (CDN)
Grade A on Avoid empty src or href
Grade F on Add Expires headers
Grade D on Compress components with gzip
Grade A on Put CSS at top
Grade F on Put JavaScript at bottom
Grade A on Avoid CSS expressions
Grade n/a on Make JavaScript and CSS external
Grade F on Reduce DNS lookups
Grade A on Minify JavaScript and CSS
Grade A on Avoid URL redirects
Grade A on Remove duplicate JavaScript and CSS
Grade F on Configure entity tags (ETags)
Grade A on Make AJAX cacheable
Grade A on Use GET for AJAX requests
Grade D on Reduce the number of DOM elements
Grade A on Avoid HTTP 404 (Not Found) error
Grade B on Reduce cookie size
Grade A on Use cookie-free domains
Grade A on Avoid AlphaImageLoader filter
Grade C on Do not scale images in HTML
Grade A on Make favicon small and cacheable
I also tried out the beta YSlow for Chrome and it reported the same information: https://chrome.google.com/webstore/detail/ninejjcohidippngpapiilnmkgllmakh
Scott, I'm surprised you didn't mention Combres. I'm sure I've seen you mention it before:
http://combres.codeplex.com/
Nuget package that you drop onto your site, configure the css / js to use, and it will handle combining / minifying and gzipping.
I roll it out on all my sites.
My entire team use it and we love it.
Scott Forsyth has a nice post about the settings (http://weblogs.asp.net/owscott/archive/2009/02/22/iis-7-compression-good-bad-how-much.aspx)
If someone can talk to the aspnetcdn guys to fix their setting, I would appreciate it. I've tried a few times.
http://code.google.com/speed/page-speed/docs/caching.html
'Set Expires to a minimum of one month, and preferably up to one year, in the future. (We prefer Expires over Cache-Control: max-age because it is more widely supported.) Do not set it to more than one year in the future, as that violates the RFC guidelines.'
Because, for some reason, the MIME type application/json is not compressible using urlCompression. I think you can get it to work if you use text/json instead, but that's a workaround.
Rick Strahl has some information: http://www.west-wind.com/weblog/posts/2011/May/05/Builtin-GZipDeflate-Compression-on-IIS-7x
For my projects I use Combres (http://combres.codeplex.com/), which I think is similar to SquishIt; it's really easy to use and has gzip, minify and a kind of asset management.
A good starting point for customizing your web.config is the web.config from the HTML5 Boilerplate (http://html5boilerplate.com/) - there are some really nice things in it (gzip, compression, caching).
As an add-in for Visual Studio, not only does it convert SASS, LESS and CoffeeScript and offer combining and minifying; handiest of all (IMO), it does this in the background each time you save a file, so if you wish you can develop against the combined and minified files.
On the flip-side, I'd like to find out how to implement versioning in Chirpy (in an automated sense).
I'd like to second (third?) the shout out to Chirpy and Combres. These two are dead simple to use and improve performance incredibly; our site footprints have decreased to around 1/3 of their previous size. For the amount of effort involved to implement, these two are a massive ROI.
We also do all of our own gzipping of ActionResults using a custom AttributeFilter that compresses the response OnActionExecuting, which gives us complete control over what we compress (a rough sketch of the pattern follows below).
These simple things only take a day or so to set up, and you can port the functionality between projects with minimal fuss.
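Here's a minimal sketch of that kind of filter (not the poster's actual code); it gzips the response when the browser advertises support for it:

using System;
using System.IO.Compression;
using System.Web.Mvc;

public class CompressResponseAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        var request = filterContext.HttpContext.Request;
        var response = filterContext.HttpContext.Response;
        string acceptEncoding = request.Headers["Accept-Encoding"] ?? string.Empty;

        // Only compress when the client says it can handle gzip.
        if (acceptEncoding.IndexOf("gzip", StringComparison.OrdinalIgnoreCase) >= 0)
        {
            response.Filter = new GZipStream(response.Filter, CompressionMode.Compress);
            response.AppendHeader("Content-Encoding", "gzip");
        }
    }
}

You'd decorate the JSON-returning actions (or a whole controller) with [CompressResponse] to opt them in.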
can be seen here: https://bitbucket.org/DotNetNerd/minime/overview
Just to let you know :)
Sure, but that's only for static content, for which <urlCompression /> is just fine. I'm talking about dynamically generated JSON (application/json). doDynamicCompression = true will NOT gzip application/*, only text/*.
But yes, that article raises a good point. I've always used this for IIS7.x in my Web.config, which is pretty much the same as the article you link to proposes (minus the caching).
<system.webServer>
<urlCompression doDynamicCompression="true" doStaticCompression="true" dynamicCompressionBeforeCache="true" />
<staticContent>
<remove fileExtension=".js" />
<mimeMap fileExtension=".js" mimeType="text/javascript" />
<clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="365.00:00:00" />
</staticContent>
</system.webServer>
That MIME-type re-mapping must be done for IIS7.5 to compress .js files. Not sure about IIS7, but you probably want to update the article with this, Scott. If you don't do this remapping, IIS7.5 will, for some reason unbeknownst to me, use an application/x-javascript mimetype and it won't gzip.
Webpage error details
User Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET4.0C; .NET4.0E; MS-RTC LM 8; InfoPath.3)
Timestamp: Fri, 2 Sep 2011 13:36:53 UTC
Message: Expected identifier
Line: 2
Char: 9
Code: 0
URI: http://www.hanselman.com/blog/TheImportanceAndEaseOfMinifyingYourCSSAndJavaScriptAndOptimizingPNGsForYourBlogOrWebsite.aspx?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+ScottHanselman+%28Scott+Hanselman+-+ComputerZen.com%29
Am I missing something unblushingly obvious here?
(ugh, this solution reminds me of my If False Then block that I have to put into my MVC3 projects because you don't get intellisense with Url.Content *hint hint*... while I'm at it, it would be neat if "Url.Content(" brought up a nifty "Browse..." helper the way typing "src=" does... is this for The Gu? How did I get here?)
Do you just have to skip CDN on a secure site?
We've been investigating how to improve the compression ratio on the MS Ajax CDN for a while now, but unfortunately it's not straightforward. The CDN is run on the Azure CDN infrastructure, so we have absolutely zero control over these types of settings, and any changes they would make for us would affect many, many sites. We're continuing to push to improve it though (we changed the URL to ensure there are no cookies, for example) and hope to get the compression turned up to 11 some day in the future.
Check out this article:
http://encosia.com/cripple-the-google-cdns-caching-with-a-single-character/
Another way to decrease the footprint of your site that seems to get overlooked a lot is to load scripts only when needed.
A lot of developers just add every single script that gets used somewhere into a template. In practice there may be a few that are used on every page (e.g. jQuery). Many site users may never even go to pages or features that need some of your scripts. Set up an architecture that lets you specify what is required on each page, and load only those scripts.
Finally you can load rarely-used scripts on demand. For example, I have a web site that only loads the authentication script (which includes SHA2 encryption code, and so on) when a user actually clicks "login." The actual process of logging in only takes place once per user session - why keep that script on every page? Just use (e.g.) jQuery getScript and load such things on demand. As long as there is not a need for instantaneous response, this will be invisible to the user. That is - logging in requires handshaking with the server anyway, so this doesn't impact the perception of performance.
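One way to sketch the "declare what each page needs" part on the server side; the ScriptRegistry type and its members here are made up for illustration:

using System.Collections.Generic;
using System.Text;

// Each page (or view) calls Require() for the scripts it actually uses;
// the layout calls Render() once, so rarely-used scripts never reach
// pages that don't ask for them.
public class ScriptRegistry
{
    private readonly List<string> _scripts = new List<string>();

    public void Require(string scriptUrl)
    {
        if (!_scripts.Contains(scriptUrl))
        {
            _scripts.Add(scriptUrl);
        }
    }

    public string Render()
    {
        var sb = new StringBuilder();
        foreach (string url in _scripts)
        {
            sb.AppendFormat("<script src=\"{0}\"></script>\n", url);
        }
        return sb.ToString();
    }
}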
"Give the browser less work to do, especially if, like me, almost 10% of your users are mobile. Don't make these little phones work hard than they need to and remember that not everyone
Let your browser cache everything, forever"
Another related topic worth mentioning is image spriting. Might check out RequestReduce which sprites background images on the fly to optimized 8 bit PNGs as well as minifies and merges CSS. It brought my blog from a YSLOW C to an A with practically 0 effort.
Matt Wrock
Anonymous Fan - Fixed, thanks!
Jon - I think the Chief Performance Engineer at Google would disagree with you. ;)
Maybe I'm a bit late on this but I still hope to get an answer.
Is it recommended to use Base64-encoded images?
Just thinking out loud: if I have the same image multiple times on my page in Base64 format, a benefit could only be seen when using GZIP compression, is this correct?
Otherwise the image would be transmitted twice (as a string within the markup).
In this case "normal" PNGs, for example, are the preferred way I guess, because the image gets transmitted only once and then used from cache (the second time).
(My) conclusion: Base64 should only be used in combination with GZIP.
The reason I'm asking is that I want to reduce HTTP requests but find it relatively hard to create and work with sprites.
In what way could the use of Base64 replace the sprite technique?
Additional thoughts? I'd really appreciate it.
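For illustration, a data: URI is just the image bytes Base64-encoded behind a small prefix, so it's easy to generate one and drop it into a stylesheet; a quick C# sketch (the file name is hypothetical):

using System;
using System.IO;

class DataUriExample
{
    static void Main()
    {
        // Inlining trades an extra HTTP request for a larger payload, so it
        // pays off most with gzip enabled and for small images referenced from
        // CSS, since the stylesheet is cached and only downloaded once.
        byte[] bytes = File.ReadAllBytes("sprite-arrow.png");
        string dataUri = "data:image/png;base64," + Convert.ToBase64String(bytes);

        Console.WriteLine(".arrow {{ background-image: url({0}); }}", dataUri);
    }
}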
I agree with the above poster that notes "combining a lot of scripts into one is not necessarily the best way to improve performance."
Nowadays, with HTTP/1.1, we have such a thing as Keep-Alive. That means there is a negligible difference between loading 1 script from your server, and loading 10 (I'm talking latency, not bandwidth).
The benefit of separate scripts, as the above poster notes, is that they can load independently on only the pages they are required, and they're cached independently.
Of course, you should still be versioning, minifying, and zip'ing them!
I went with Combres. Very nice!
In the case of running an intranet site (let's say 200 users), could one get away with just "caching things forever"? This would seem to make upgrades easier and really only take the performance hit on each user's first page visit.
Thanks good article!
I have a couple of questions....
1) Let's say, for example, a commercial website has a lot of code with lots of CSS, JavaScript and a few HTML files. After minifying these files, could any exceptions arise once the minified files are deployed to the web server where the code resides, or could there be other environment challenges?
2) Apart from the advantages, is there any disadvantage to minifying a JavaScript, CSS or even HTML file?
1. Is there a YSlow-like tool that works with IE? If not, what other tool or process can we follow to analyze a web page in IE?
Thanks
Raghu
Nice article; I just implemented some of the techniques discussed here on my existing website. I used SquishIt.
Very good article!
Just curious and don't really expect an answer, but how does PNG compare to GIF? I often use GIF for other purposes (email attachments, etc.) when the picture is suitable for this lossless format (when it has large blocks of solid color, basically).
Does PNG support 'GIF-like' compression with similar performance even if the original is not a vector-based image?
//Sten