Forcing GZIP: http://www.stevesouders.com/blog/2010/07/12/velocity-forcing-gzip-compression/
This would make an excellent WordPress plugin.
I gave it a shot. I followed the method exactly to spec. It worked in my personal tests but could probably use some additional testing.
Also, nice easter egg hidden in the headers!
I meant to include that I will be submitting it to the plugins directory as well… Now I should get back to my GSoC project!
Just issued a small update to fix something and make use of some feedback from Google.
For the majority of WP users this is less relevant than we may first think, for a simple reason:
Most WP users are on shared-hosting, and gzip support in shared-hosting setups is rare. (Really rare!)
For the rest of the cases, where the server can send gzipped content, I am curious to know what would happen if one sent gzipped content unconditionally, that is, without regard for the Accept-Encoding header and without running any tests first.
gzip support in shared-hosting setups is rare. (Really rare!)
What is your source for this? I can’t agree that it is rare. And are we talking about compression via Apache or via PHP?
I am curious to know what would happen if one sent gzipped content unconditionally, that is, without regard for the Accept-Encoding header and without running any tests first.
The article says they check for cookies first.
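As I understand the technique from the post (the cookie name here is made up for illustration), the idea is to force compression only for clients that have already proven they can decode gzip, e.g. by running a small gzipped test resource that sets a cookie:

    <?php
    // Rough sketch of the forced-gzip idea; "force_gz" is a
    // hypothetical cookie set by a gzipped test resource that the
    // browser managed to decode. A client that passed the test gets
    // compressed output even if a proxy stripped Accept-Encoding.
    function maybe_force_gzip( $html ) {
        $accepts = isset( $_SERVER['HTTP_ACCEPT_ENCODING'] )
            && false !== strpos( $_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip' );
        $proven  = isset( $_COOKIE['force_gz'] );

        if ( $accepts || $proven ) {
            header( 'Content-Encoding: gzip' );
            header( 'Vary: Accept-Encoding' );
            return gzencode( $html );
        }
        return $html; // untested client: play it safe, send plain text
    }

    ob_start( 'maybe_force_gzip' );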
You can gzip in PHP with ob_start( 'ob_gzhandler' ).
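Something like this, for anyone who wants to try it (ob_gzhandler inspects the Accept-Encoding request header itself and falls back to plain output when the browser does not advertise gzip support):

    <?php
    // Minimal sketch: compress all script output with PHP's built-in
    // gzip output handler.
    if ( ! ob_start( 'ob_gzhandler' ) ) {
        ob_start(); // gzip handler unavailable; buffer plain output
    }

    echo '<html><body>Hello, compressed world!</body></html>';

    ob_end_flush(); // send the (possibly compressed) buffer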
ob_start( 'ob_gzhandler' )
But that’s not a good option, which is why it is not in core anymore.
Also, that option does nothing for static resources like JS and CSS files, for which gzipping yields great benefits. Two examples: 1. The main CSS of Twenty Ten is 22kB uncompressed and 5kB compressed. 2. The latest, minified jQuery (1.4.2) is 71kB uncompressed and 25kB compressed.
My source is checks that I do myself from time to time. The last time I checked a large number of hosting companies was in spring 2009. For an indication: at that time, none of the hosts recommended on the w.org page supported gzip compression in their shared-hosting offerings.
I would be interested to know if the situation has changed significantly since then, but I am not holding my breath, since that happens for a reason: gzip compression is expensive.
Actually, it got removed solely because it was impossible to tell reliably whether compression was already enabled at another level of the web server stack, either in the web server itself or for all PHP requests.
That led to a lot of people suffering from double compression when it was enabled.
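For the curious, the sort of detection involved looks something like this (a sketch, not the actual core code), and the comments show why it could never be complete; a front-end proxy that compresses is invisible to all three checks:

    <?php
    // Sketch of the kind of layered check that proved unreliable.
    function compression_already_enabled() {
        // Is zlib compressing every PHP response?
        if ( ini_get( 'zlib.output_compression' ) ) {
            return true;
        }
        // Has other code already registered the gzip handler?
        if ( in_array( 'ob_gzhandler', ob_list_handlers(), true ) ) {
            return true;
        }
        // Is mod_deflate loaded? Only visible under mod_php, and
        // "loaded" does not mean "configured for this vhost".
        if ( function_exists( 'apache_get_modules' )
            && in_array( 'mod_deflate', apache_get_modules(), true ) ) {
            return true;
        }
        return false; // ...or compression happens somewhere we cannot see
    }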
The problem with compressing output to browsers is that you’re really making a tradeoff. Compression uses less bandwidth, but if you’re compressing on the fly then you’re using more CPU time. On shared hosting solutions, CPU time is generally more limited than bandwidth, so compression like this ain’t always a good thing.
Compression on the fly is only a benefit if your servers are network-I/O bound, not CPU bound.
You often do better to focus on one-time compression of whatever you can compress ahead of time, and on good caching of output (which could be compressed once).
You’re thinking only from the server side — it’s a benefit from the client side by making things much faster.
It’s a diminishing returns problem. Compressing 20k of HTML down to 4k will indeed be faster, but not if it takes you longer to compress than to send 16k down the pipe.
On the other hand, if you also use a static caching solution to save the compressed stream for serving it up again later, then you can achieve enhancements all around, since you don’t incur a CPU time penalty for compressing every single time.
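A sketch of that compress-once idea (the cache path is made up, and a real version would also check Accept-Encoding before serving the gzipped copy, or keep both variants):

    <?php
    // Compress-once page cache, illustrative only. First hit pays
    // the CPU cost of gzencode(); later hits stream the .gz file
    // from disk with no per-request compression. Assumes the cache
    // directory exists and is writable.
    $cache_file = '/tmp/page-cache/' . md5( $_SERVER['REQUEST_URI'] ) . '.gz';

    if ( is_readable( $cache_file ) ) {
        header( 'Content-Encoding: gzip' );
        header( 'Vary: Accept-Encoding' );
        readfile( $cache_file );
        exit;
    }

    ob_start();
    // ... generate the page as usual ...
    $html = ob_get_clean();

    $gz = gzencode( $html, 6 ); // level 6: decent ratio, modest CPU
    file_put_contents( $cache_file, $gz );

    header( 'Content-Encoding: gzip' );
    header( 'Vary: Accept-Encoding' );
    echo $gz;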
I think you’re vastly overestimating the time it takes to compress something on the fly.
Time for some ab (ApacheBench) runs, but I don’t have an unloaded box handy.
@Matt: If anyone does some tests, they should test both Apache and PHP, since some articles recommend only the PHP route without mentioning Apache’s mod_deflate.
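For the Apache side, the usual mod_deflate setup is only a few lines of .htaccess, assuming the host allows overrides:

    # Sketch: compress text responses with mod_deflate. The IfModule
    # wrapper makes it fail quietly on hosts without the module.
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript
    </IfModule>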
This is a pretty nice tool for testing whether gzip is on, and how much it would save if it were:
If you want to test the whole page (including JS and CSS files), good tools are WebPagetest (just click the images on the results page for detailed information) and Web Page Analyzer.
@Matt: It warms my heart to have you evangelizing performance this way. You made my day.
Compression is nearly always a win if the payload is > 1 kB. My blog post yesterday has a chart ( http://stevesouders.com/images/roundtrips-per-kb.png ) that shows the number of roundtrips required for various payload sizes. Reducing roundtrips, esp. for JS and CSS, makes the page load much faster. Netflix found that turning on gzip compression made pages 13-25% faster ( http://assets.en.oreilly.com/1/event/7/Improving%20Netflix%20Performance%20Presentation.pdf ).
For static JS and CSS files, it is a challenge if the WP user does not have access to their web server config. One alternative would be to have gzipped versions of these files (main.js.gz) and WP could dynamically determine which static file to include.
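A sketch of that fallback (the helper name and paths are hypothetical): ship a pre-gzipped sibling next to each static file and let WP serve whichever one the client can handle:

    <?php
    // Serve a pre-compressed sibling (e.g. jquery.js.gz) when the
    // client advertises gzip support; otherwise fall back to the
    // plain file. Helper name is made up for illustration.
    function pick_static_file( $path ) {
        $accepts = isset( $_SERVER['HTTP_ACCEPT_ENCODING'] )
            && false !== strpos( $_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip' );

        if ( $accepts && is_readable( $path . '.gz' ) ) {
            header( 'Content-Encoding: gzip' );
            header( 'Vary: Accept-Encoding' );
            return $path . '.gz';
        }
        return $path;
    }

    header( 'Content-Type: application/javascript' );
    readfile( pick_static_file( '/path/to/wp-includes/js/jquery/jquery.js' ) );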
Thank Justin, not me! 🙂