Deploying a static website efficiently on Google Cloud Storage
I have several static websites hosted on Google Cloud Storage; the only thing still missing is the ability to use a custom domain with SSL.

When deploying from my local file system, I was using the gsutil cp command, which can gzip files by extension (html, css, js). However, every file was copied on each run, which was wasteful when the site contained several unmodified images.

My first experience with gsutil rsync was bad:

- I could not gzip the files I needed to.
- Just touching a file caused it to be re-deployed.

However, I was able to find an optimized way to deploy:

1. Split my local folder into 2 folders with the same hierarchy: one containing the content to be gzipped (html, css, js...), the other containing the remaining files.
2. Gzip each file in my gzip folder (in place).
3. Call gsutil rsync for each folder to the same gs destination.

Of course, this is only a one-way synchronization, and files deleted locally are not deleted remotely.

For the gzip folder the command is gsu...
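The three steps above can be sketched as a small POSIX shell script. This is a minimal sketch, not the post's exact commands: the folder names (`site`, `deploy`) and the bucket `gs://example-bucket` are placeholders, and the final gsutil calls are echoed rather than executed so the split-and-gzip part can be tried locally. Note that the `Content-Encoding: gzip` header is what tells Cloud Storage the objects are pre-compressed.

```shell
#!/bin/sh
set -eu

# deploy SRC STAGE BUCKET
#   SRC    - local website folder (e.g. site)
#   STAGE  - staging folder that will receive the gzipped/ and plain/ trees
#   BUCKET - destination bucket, e.g. gs://example-bucket (placeholder)
deploy() {
  SRC=$1; STAGE=$2; BUCKET=$3

  rm -rf "$STAGE"
  mkdir -p "$STAGE/gzipped" "$STAGE/plain"

  # Step 1: split the tree. Compressible text assets go into one folder,
  # everything else into the other, keeping the same hierarchy in both.
  ( cd "$SRC" && find . -type f ) | while IFS= read -r f; do
    case "$f" in
      *.html|*.css|*.js) dest="$STAGE/gzipped" ;;
      *)                 dest="$STAGE/plain" ;;
    esac
    mkdir -p "$dest/$(dirname "$f")"
    cp "$SRC/$f" "$dest/$f"
  done

  # Step 2: gzip each text asset in place. gzip(1) normally appends .gz,
  # so compress to a temporary file and move it back over the original name.
  find "$STAGE/gzipped" -type f | while IFS= read -r f; do
    gzip -9 -c "$f" > "$f.tmp" && mv "$f.tmp" "$f"
  done

  # Step 3: rsync each folder to the same destination. Echoed here as a
  # dry run; remove the leading "echo" to actually deploy. The -h option
  # sets Content-Encoding so GCS serves the pre-gzipped objects correctly.
  echo gsutil -m -h "Content-Encoding:gzip" rsync -r "$STAGE/gzipped" "$BUCKET"
  echo gsutil -m rsync -r "$STAGE/plain" "$BUCKET"
}
```

Running `deploy site deploy gs://example-bucket` leaves `deploy/gzipped` holding gzipped html/css/js under their original names and `deploy/plain` holding everything else, ready for the two rsync calls.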