Google's new compression algorithm
Google has recently released a new open-source compression algorithm called Zopfli (thanks Thiago Silvino for the notice). According to the four-page paper authored by Google, Zopfli achieves a better compression ratio than gzip, 7-zip and kzip, while compression is much slower - it takes roughly 100 times as long (an increase of about 10000%!). We do not really care about compression time at all, since compression is done just once, when Slax is built. So, could this new algorithm be used in Slax with any benefit?
To answer that question we need to understand what data we are comparing. Slax uses LZMA2 (XZ) compression, which was introduced by the 7-Zip project. LZMA2 is by far the best compression widely available nowadays. Yet Zopfli claims to provide better compression than 7-zip, so it could really seem to be better. But there is a catch. The catch is that Zopfli generates output backward compatible with the DEFLATE algorithm. When the benchmarks were made, all the compressors were producing DEFLATE-compatible output as well. Even when the 7-Zip application was used, it did not compress with LZMA2; it compressed with DEFLATE (the format used in ZIP archives). In other words, Zopfli was only compared against other DEFLATE encoders, never against LZMA2 itself.
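Both points are easy to demonstrate with Python's standard zlib and lzma modules (this is just an illustrative sketch, not Zopfli itself): every DEFLATE stream can be read by the same standard decompressor no matter how hard the encoder worked, and LZMA still compresses the same data considerably better than DEFLATE does.

```python
import lzma
import zlib

# Semi-repetitive sample data, vaguely resembling filesystem content:
# many similar lines that differ only in a counter.
data = b"".join(b"block %06d: some repeated payload text\n" % i
                for i in range(5000))

# DEFLATE streams produced with different encoder effort...
fast = zlib.compress(data, level=1)
best = zlib.compress(data, level=9)

# ...are all readable by the very same standard decompressor.
# Zopfli exploits exactly this: a smarter encoder, same old decoder.
assert zlib.decompress(fast) == data
assert zlib.decompress(best) == data

# LZMA (the algorithm family behind XZ/LZMA2, as used in Slax)
# compresses the same data to a noticeably smaller size.
xz = lzma.compress(data)
assert lzma.decompress(xz) == data

print("original:", len(data))
print("deflate level 1:", len(fast))
print("deflate level 9:", len(best))
print("xz/lzma:", len(xz))
```

A better DEFLATE encoder like Zopfli could shave a few percent off the level-9 figure, but it cannot leave the DEFLATE format, so it cannot approach what LZMA achieves on the same input.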
So, the answer is no. We can't use Zopfli in Slax, since its compression ratio is insufficient for our needs compared to the LZMA2 we already use. But I am sure Zopfli will find its way to the web, where static HTML pages may be compressed using the improved algorithm while still being decompressible by all the existing DEFLATE decompressors used in web browsers.