B4J Question [BANano]: BANano.GZipGeneratedWebsite() after Post Processor?

Mashiane

Expert
Licensed User
Ola

I'm running a post processing script based on Kiffi's post processor

My challenge is that when I run my site through website profilers, they recommend serving the resources with gzip. This means the images, css and js files need to be gzipped for better performance of the website.

I assume BANano.GZipGeneratedWebsite runs before any post processing takes place. Is it possible to have this feature activate after the post processing is done?

Assumptions:

My app is reported as having a lot of css and js files in it. Granted. This includes all the css and js files from the BANano-based lib that I use for creating my BANano-based apps.

After BANano.Build, my post processor ensures that ONLY the css and js files used by my project are included, and all others are removed from the final build. The css files are consolidated into a single file, which is the only one added to BANano.Header before the post processing.

The catch, though, is that if .UseServiceWorker is turned on, all the css and js files added on the Files tab are included, irrespective of the post-processing exercise.

The js files are trickier: some just break my code if merged into a single file, so I leave those for compression by the post processor only.

So, without having to manually remove files from the Files tab (directly from my BANano library), I need a way to post process the service worker's css and js list, gzip the resources after post processing, etc. etc. Phew!
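The post-build gzip step described above could be sketched as a small script run after the post processor. This is only an illustration, not BANano's actual implementation: the build folder path and the extension list are assumptions, and images are deliberately skipped because they are already compressed.

```python
import gzip
import shutil
from pathlib import Path

# Hypothetical post-processing step: after the post processor has pruned and
# merged the css/js files, walk the build folder and write a .gz copy next to
# each remaining text resource. Already-compressed formats (jpg, png, ...)
# are skipped, since gzipping them can make them larger.
COMPRESSIBLE = {".css", ".js", ".html", ".json", ".svg"}

def gzip_build_folder(build_dir):
    made = []
    for path in Path(build_dir).rglob("*"):
        if path.is_file() and path.suffix.lower() in COMPRESSIBLE:
            gz_path = path.with_name(path.name + ".gz")
            with open(path, "rb") as src, gzip.open(gz_path, "wb", compresslevel=9) as dst:
                shutil.copyfileobj(src, dst)
            made.append(gz_path)
    return made
```

Running this as the last step would give the server a pre-made `.gz` twin for every css/js file that survived post processing.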

Ta!
 

OliverA

Expert
Licensed User
I'm pretty sure GZipping(?) is a task for the web server hosting your site. Also, the only ways to properly reduce JPG output size are changing the dimensions of the image, lowering its quality, or reducing the colors used in the image. Compressing JPG files again (via GZip) may be counterproductive, increasing the size of the generated file.
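The point about JPGs is easy to verify: deflate can't shrink data that is already high-entropy, which is what JPEG output looks like, so gzip's header and block overhead make the result slightly larger. A quick check (using random bytes as a stand-in for already-compressed image data):

```python
import gzip
import os

# Repetitive text (like css/js) compresses very well...
text = b"body { margin: 0; padding: 0; }\n" * 1000
compressed_text = gzip.compress(text)

# ...but high-entropy bytes (a stand-in for already-compressed JPEG data)
# do not: gzip's header/trailer and block overhead outweigh any savings.
random_bytes = os.urandom(100_000)
compressed_random = gzip.compress(random_bytes)

print(len(text), "->", len(compressed_text))            # large reduction
print(len(random_bytes), "->", len(compressed_random))  # slightly larger
```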

Links:
https://betterexplained.com/articles/how-to-optimize-your-site-with-gzip-compression/
https://superuser.com/questions/464315/why-is-a-7zipped-file-larger-than-the-raw-file
 

Mashiane

Expert
Licensed User
@OliverA your comments are noted, thanks. BANano.GZipGeneratedWebsite generates a .gz file for each css and js resource in your project so that the server has them available for the needed exchanges.

All of this happens as soon as your resources are copied from the Files tab to the build folder of your project, but before the post processor runs.
 

OliverA

Expert
Licensed User
BANano.GZipGeneratedWebsite generates these .gz
I guess I'm asking why the .gz files are generated beforehand, since it has always been my experience/observation/limited knowledge that the web server does this for you. Quote:
On the web, gzipping is done directly by your server. It's a matter of configuring the server to do it. Once that's done, gzipping happens automatically; there isn't any ongoing work you have to do. The server compresses the file and sends it across the network like that. The browser receives the file and unzips it before using it. I've never heard anyone mention anything about the overhead of zipping and unzipping, so I just assume it's negligible and the benefits far outweigh the overhead.
Source: https://css-tricks.com/the-difference-between-minification-and-gzipping/
If something has changed and you/someone else can provide some information on pre-gzipping files for the server and the benefits, then please do so.
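For reference, the on-the-fly compression described in that quote is a one-time server configuration. A sketch for nginx (directive names are from the standard ngx_http_gzip_module; the type list is illustrative):

```nginx
# nginx: compress responses on the fly (ngx_http_gzip_module)
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_min_length 1024;   # skip tiny files where overhead outweighs savings
# image/jpeg and image/png are deliberately not listed: they are already
# compressed, and recompressing them gains nothing (or loses a little).
```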
 

alwaysbusy

Expert
Licensed User
@OliverA You are right. It is something I took over from ABM where, way back in the day, jServer couldn't do gzipping yet and doing it manually somehow worked (or at least the site-speed tools were somehow fooled, I can't really recall). I just tested it with a BANano site from a client (where gzipping is not enabled on the server side) and it doesn't make any difference (although I could swear that in the far past, when I built it in ABM, it somehow did).

But getting old and memory and stuff, you know... :D
 

OliverA

Expert
Licensed User
Nginx can use this when you set gzip_static to ‘on’.
Thanks! Looks like I learned something new. I can see such a setting for Nginx, since it tries to be light and lean, whereas something like Apache is like having everything, including the kitchen sink. One of the reasons (in the past) to let the server handle the compression is that it will compress all files that it can (html, txt, etc.), leaving that burden to the server instead of the developer. Some servers (this is off the top of my head) skip compressing files such as JPGs due to the negligible (or even counterproductive) outcome of compressing these files (either out of the box or through configuration).
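The gzip_static approach mentioned above is exactly the case where pre-generated .gz files (like those from BANano.GZipGeneratedWebsite) pay off: nginx serves `file.css.gz` directly instead of compressing `file.css` on every request. A minimal sketch (the location pattern is illustrative):

```nginx
# nginx: serve pre-compressed files (ngx_http_gzip_static_module;
# nginx must be built with --with-http_gzip_static_module)
location ~* \.(css|js)$ {
    gzip_static on;   # if style.css.gz exists beside style.css, send it
}
```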
 